Multiple Regression II KNNL Chapter 7 Extra Sums


Extra Sums of Squares
• For a given dataset, the total sum of squares (SSTO) remains the same no matter which predictors are included (provided no missing values exist among the variables)
• As we include more predictors, the regression sum of squares (SSR) increases (technically, does not decrease) and the error sum of squares (SSE) decreases
• SSR + SSE = SSTO, regardless of the predictors in the model
• When a model contains just X1, denote: SSR(X1), SSE(X1)
• Model containing X1, X2: SSR(X1, X2), SSE(X1, X2)
• Predictive contribution of X2 above that of X1: SSR(X2|X1) = SSE(X1) – SSE(X1, X2) = SSR(X1, X2) – SSR(X1)
• Extends to any number of predictors
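The identities above can be checked numerically. A minimal sketch, assuming simulated data and NumPy's least-squares solver (variable names are illustrative, not from the slides):

```python
import numpy as np

def fit_ols(X, y):
    """Fit OLS with intercept; return (SSR, SSE) for the model."""
    Xd = np.column_stack([np.ones(len(y))] + list(X))  # design matrix with intercept
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    fitted = Xd @ beta
    sse = np.sum((y - fitted) ** 2)
    ssr = np.sum((fitted - y.mean()) ** 2)
    return ssr, sse

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2 + 1.5 * x1 + 0.8 * x2 + rng.normal(scale=0.5, size=n)

ssr1, sse1 = fit_ols([x1], y)            # SSR(X1), SSE(X1)
ssr12, sse12 = fit_ols([x1, x2], y)      # SSR(X1,X2), SSE(X1,X2)

# Both routes to the extra sum of squares SSR(X2|X1) agree,
# and SSTO = SSR + SSE is the same for both models
extra_via_sse = sse1 - sse12
extra_via_ssr = ssr12 - ssr1
print(extra_via_sse, extra_via_ssr)      # equal (up to rounding)
```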

Definitions and Decomposition of SSR
Note that as the number of predictors increases, so does the number of ways of decomposing SSR.

ANOVA – Sequential Sum of Squares

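Sequential sums of squares attribute SSR to the predictors in their order of entry: SSR(X1), then SSR(X2|X1), then SSR(X3|X1,X2), and these sum to SSR(X1,X2,X3). A sketch with simulated data (names illustrative):

```python
import numpy as np

def sse(X, y):
    """SSE for an OLS fit with intercept; X is a list of predictor arrays."""
    Xd = np.column_stack([np.ones(len(y))] + list(X))
    b, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return np.sum((y - Xd @ b) ** 2)

rng = np.random.default_rng(3)
n = 60
x1, x2, x3 = rng.normal(size=(3, n))
y = 1 + 0.9 * x1 + 0.6 * x2 - 0.4 * x3 + rng.normal(scale=0.7, size=n)

ssto = np.sum((y - y.mean()) ** 2)
# Sequential (extra) sums of squares in the entry order X1, X2, X3
seq1 = ssto - sse([x1], y)                       # SSR(X1)
seq2 = sse([x1], y) - sse([x1, x2], y)           # SSR(X2|X1)
seq3 = sse([x1, x2], y) - sse([x1, x2, x3], y)   # SSR(X3|X1,X2)
ssr_full = ssto - sse([x1, x2, x3], y)           # SSR(X1,X2,X3)
print(seq1 + seq2 + seq3, ssr_full)              # equal: sequential SS sum to SSR
```

Changing the entry order changes the individual sequential terms, but not their total.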

Extra Sums of Squares & Tests of Regression Coefficients (Single bk)

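The extra-sum-of-squares F test for a single coefficient, F* = SSR(Xk | other predictors) / MSE(full), is equivalent to the square of the usual t-statistic for bk. A sketch checking that equivalence (simulated data; standard OLS matrix formulas assumed):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 40
x1, x2 = rng.normal(size=(2, n))
y = 1 + 0.5 * x1 + 0.3 * x2 + rng.normal(size=n)

def fit(X, y):
    Xd = np.column_stack([np.ones(len(y))] + list(X))
    b, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return b, np.sum((y - Xd @ b) ** 2), Xd

b_full, sse_full, Xd = fit([x1, x2], y)
_, sse_red, _ = fit([x1], y)                # reduced model drops X2

mse_full = sse_full / (n - 3)
F = (sse_red - sse_full) / 1 / mse_full     # extra SS for X2, 1 numerator df

# The same test via the t-statistic for b2: F* = t^2
cov = mse_full * np.linalg.inv(Xd.T @ Xd)   # estimated Var(b) = MSE (X'X)^-1
t = b_full[2] / np.sqrt(cov[2, 2])
print(F, t ** 2)                            # equal
```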

Extra Sums of Squares & Tests of Regression Coefficients (Multiple bk)

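To test several coefficients at once (e.g., H0: b2 = b3 = 0), compare the full model to the reduced model that drops those predictors: F* = [(SSE_R − SSE_F)/(df_R − df_F)] / [SSE_F/df_F]. A sketch with simulated data:

```python
import numpy as np

def sse_df(X, y):
    """SSE and error df for an OLS fit with intercept."""
    Xd = np.column_stack([np.ones(len(y))] + list(X))
    b, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return np.sum((y - Xd @ b) ** 2), len(y) - Xd.shape[1]

rng = np.random.default_rng(4)
n = 50
x1, x2, x3 = rng.normal(size=(3, n))
y = 2 + 1.0 * x1 + 0.5 * x2 + 0.5 * x3 + rng.normal(size=n)

# Full model: X1, X2, X3; reduced model under H0: b2 = b3 = 0
sse_f, df_f = sse_df([x1, x2, x3], y)
sse_r, df_r = sse_df([x1], y)

F = ((sse_r - sse_f) / (df_r - df_f)) / (sse_f / df_f)
print(F)   # compare to the F(df_r - df_f, df_f) critical value
```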

Extra Sums of Squares & Tests of Regression Coefficients (General Case)


Other Linear Tests

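Other linear hypotheses, such as H0: b1 = b2, fit the same full-versus-reduced framework: the reduced model imposes the constraint (here, regressing Y on the single predictor X1 + X2). A sketch under that assumption:

```python
import numpy as np

def sse_df(X, y):
    """SSE and error df for an OLS fit with intercept."""
    Xd = np.column_stack([np.ones(len(y))] + list(X))
    b, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return np.sum((y - Xd @ b) ** 2), len(y) - Xd.shape[1]

rng = np.random.default_rng(7)
n = 50
x1, x2 = rng.normal(size=(2, n))
y = 2 + 0.8 * x1 + 0.8 * x2 + rng.normal(size=n)

# Full model: Y = b0 + b1 X1 + b2 X2 + e
sse_f, df_f = sse_df([x1, x2], y)
# Reduced model under H0: b1 = b2  ->  Y = b0 + b1 (X1 + X2) + e
sse_r, df_r = sse_df([x1 + x2], y)

F = ((sse_r - sse_f) / (df_r - df_f)) / (sse_f / df_f)
print(F)   # compare to F(1, n - 3); a small F is consistent with b1 = b2
```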

Coefficients of Partial Determination-I


Coefficients of Partial Determination-II

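The coefficient of partial determination R²(Y2|1) = SSR(X2|X1)/SSE(X1) also equals the squared correlation between the residuals of (Y on X1) and of (X2 on X1). A numerical check of that identity (simulated data; names illustrative):

```python
import numpy as np

def resid(X, y):
    """Residuals from an OLS fit with intercept."""
    Xd = np.column_stack([np.ones(len(y))] + list(X))
    b, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return y - Xd @ b

rng = np.random.default_rng(5)
n = 80
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)        # X2 correlated with X1
y = 1 + 1.2 * x1 + 0.7 * x2 + rng.normal(size=n)

sse1 = np.sum(resid([x1], y) ** 2)        # SSE(X1)
sse12 = np.sum(resid([x1, x2], y) ** 2)   # SSE(X1, X2)

# Coefficient of partial determination of X2 given X1
r2_partial = (sse1 - sse12) / sse1        # SSR(X2|X1) / SSE(X1)

# Equivalent view: squared correlation between the two residual series
e_y = resid([x1], y)
e_2 = resid([x1], x2)
r2_resid = np.corrcoef(e_y, e_2)[0, 1] ** 2
print(r2_partial, r2_resid)               # equal
```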

Standardized Regression Model - I
• Useful in reducing round-off errors in computing (X'X)^(-1)
• Makes it easier to compare the magnitudes of effects of predictors measured on different scales
• Coefficients represent changes in Y (in standard deviation units) as each predictor increases by 1 SD (holding all others constant)
• Since all variables are centered, there is no intercept term
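The correlation transformation behind the standardized model can be verified directly: standardize each variable, fit without an intercept, then back-transform each b*_k by s_Y/s_k to recover the original-scale coefficients. A sketch, assuming simulated predictors on deliberately different scales:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 60
x1 = rng.normal(loc=100, scale=15, size=n)   # predictors on very different scales
x2 = rng.normal(loc=0.5, scale=0.05, size=n)
y = 10 + 0.2 * x1 + 40.0 * x2 + rng.normal(size=n)

# Ordinary fit with intercept
Xd = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(Xd, y, rcond=None)

# Correlation transformation: center and scale each variable
def std_var(v):
    return (v - v.mean()) / (np.sqrt(n - 1) * v.std(ddof=1))

Xs = np.column_stack([std_var(x1), std_var(x2)])
ys = std_var(y)
b_star, *_ = np.linalg.lstsq(Xs, ys, rcond=None)   # no intercept needed

# Back-transform: b_k = (s_Y / s_k) * b*_k
s_y = y.std(ddof=1)
back = b_star * np.array([s_y / x1.std(ddof=1), s_y / x2.std(ddof=1)])
print(b[1:], back)   # identical coefficients on the original scale
```

The standardized coefficients b*_k are directly comparable across predictors, which is the point of the transformation.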

Standardized Regression Model - II


Standardized Regression Model - III


Multicollinearity
• Consider a model with 2 predictors (this generalizes to any number of predictors): Yi = b0 + b1Xi1 + b2Xi2 + ei
• When X1 and X2 are uncorrelated, the regression coefficients b1 and b2 are the same whether we fit simple regressions or a multiple regression, and: SSR(X1) = SSR(X1|X2), SSR(X2) = SSR(X2|X1)
• When X1 and X2 are highly correlated, their regression coefficients become unstable and their standard errors become larger (smaller t-statistics, wider CIs), leading to strange inferences when comparing simple and partial effects of each predictor
• Estimated means and predicted values are not affected
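With exactly uncorrelated (orthogonal, centered) predictors, the uncorrelated-case claims hold exactly: the X1 slope is the same in simple and multiple regression, and SSR(X1) = SSR(X1|X2). A sketch using a balanced two-level design as the orthogonal predictors:

```python
import numpy as np

rng = np.random.default_rng(2)
# Orthogonal, mean-zero predictors (a balanced 2x2 factorial pattern)
x1 = np.tile([1.0, -1.0, 1.0, -1.0], 5)
x2 = np.tile([1.0, 1.0, -1.0, -1.0], 5)
y = 3 + 2.0 * x1 - 1.0 * x2 + rng.normal(scale=0.4, size=20)

def fit(X, y):
    Xd = np.column_stack([np.ones(len(y))] + list(X))
    b, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return b, np.sum((y - Xd @ b) ** 2)

b_simple, sse1 = fit([x1], y)        # y on X1 alone
b_multi, sse12 = fit([x1, x2], y)    # y on X1 and X2
_, sse2 = fit([x2], y)               # y on X2 alone
ssto = np.sum((y - y.mean()) ** 2)

print(b_simple[1], b_multi[1])       # identical slopes for X1
print(ssto - sse1, sse2 - sse12)     # SSR(X1) == SSR(X1|X2)
```

With highly correlated predictors, neither equality holds, and the two kinds of slope can differ sharply.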