Multiple Linear Regression and Correlation Analysis Chapter 14

McGraw-Hill/Irwin, © The McGraw-Hill Companies, Inc. 2008

GOALS

1. Describe the relationship between several independent variables and a dependent variable using multiple regression analysis.
2. Set up, interpret, and apply an ANOVA table.
3. Compute and interpret the multiple standard error of estimate, the coefficient of multiple determination, and the adjusted coefficient of multiple determination.
4. Conduct a test of hypothesis to determine whether regression coefficients differ from zero.
5. Conduct a test of hypothesis on each of the regression coefficients.
6. Use residual analysis to evaluate the assumptions of multiple regression analysis.
7. Evaluate the effects of correlated independent variables.
8. Use and understand qualitative independent variables.

Multiple Regression Analysis

The general multiple regression equation with k independent variables is:

Ŷ = a + b_1 X_1 + b_2 X_2 + ... + b_k X_k

The least squares criterion is used to develop this equation. Because determining b_1, b_2, and so on by hand is very tedious, a software package such as Excel or Minitab is recommended.
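For readers following along outside Excel or Minitab, here is a minimal sketch in Python using statsmodels (an assumption; the slides do not use Python), with made-up data purely for illustration:

```python
import numpy as np
import statsmodels.api as sm

# Two independent variables (columns) and one dependent variable.
X = np.array([[35.0, 3.0],
              [29.0, 5.0],
              [36.0, 7.0],
              [60.0, 6.0],
              [65.0, 5.0],
              [30.0, 10.0]])
y = np.array([250.0, 360.0, 165.0, 43.0, 92.0, 200.0])

X = sm.add_constant(X)          # adds the intercept term a
model = sm.OLS(y, X).fit()      # least squares estimates of a, b_1, b_2
print(model.params)             # [a, b_1, b_2]
print(model.summary())          # ANOVA table, R², t-statistics, etc.
```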

Multiple Regression Analysis

For two independent variables, the general form of the multiple regression equation is:

Ŷ = a + b_1 X_1 + b_2 X_2

• X_1 and X_2 are the independent variables.
• a is the Y-intercept.
• b_1 is the net change in Y for each unit change in X_1, holding X_2 constant. It is called a partial regression coefficient, a net regression coefficient, or just a regression coefficient.

Regression Plane for a 2-Independent Variable Linear Regression Equation

Multiple Linear Regression - Example

Salsberry Realty sells homes along the East Coast of the United States. One of the questions most frequently asked by prospective buyers is: If we purchase this home, how much can we expect to pay to heat it during the winter? The research department at Salsberry has been asked to develop some guidelines regarding heating costs for single-family homes. Three variables are thought to relate to the heating costs: (1) the mean daily outside temperature, (2) the number of inches of insulation in the attic, and (3) the age in years of the furnace. To investigate, Salsberry's research department selected a random sample of 20 recently sold homes and determined the cost to heat each home last January, as well as the values of the three variables (data on the next slide).

Multiple Linear Regression - Example

[Table: heating cost, mean outside temperature, attic insulation, and furnace age for the 20 sampled homes.]

Multiple Linear Regression – Minitab Example

Multiple Linear Regression – Excel Example

The Multiple Regression Equation – Interpreting the Regression Coefficients

The regression coefficient for mean outside temperature (X_1) is -4.583. The coefficient is negative and shows an inverse relationship between heating cost and temperature: as the outside temperature increases, the cost to heat the home decreases. The numeric value of the regression coefficient provides more information: if we increase temperature by 1 degree and hold the other two independent variables constant, we can estimate a decrease of $4.583 in monthly heating cost.

The Multiple Regression Equation – Interpreting the Regression Coefficients

The attic insulation variable (X_2) also shows an inverse relationship: the more insulation in the attic, the lower the cost to heat the home. So the negative sign for this coefficient is logical. For each additional inch of insulation, we expect the cost to heat the home to decline by $14.83 per month, regardless of the outside temperature or the age of the furnace.

The Multiple Regression Equation – Interpreting the Regression Coefficients

The age of the furnace variable (X_3) shows a direct relationship: with an older furnace, the cost to heat the home increases. Specifically, for each additional year of furnace age, we expect the cost to increase by $6.10 per month.

Applying the Model for Estimation

What is the estimated heating cost for a home if the mean outside temperature is 30 degrees, there are 5 inches of insulation in the attic, and the furnace is 10 years old?
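For reference, here is the worked calculation, a sketch that assumes the fitted equation from the regression output (the intercept 427.194 comes from the textbook example; the output image is not reproduced here):

Ŷ = 427.194 - 4.583(30) - 14.831(5) + 6.101(10)
  = 427.194 - 137.490 - 74.155 + 61.010
  ≈ $276.56 per month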

Multiple Standard Error of Estimate

The multiple standard error of estimate is a measure of the effectiveness of the regression equation.

• It is measured in the same units as the dependent variable.
• It is difficult to determine what is a large value and what is a small value of the standard error.
• The formula is:

s_{Y·12...k} = sqrt( Σ(Y - Ŷ)² / (n - (k + 1)) )
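A minimal sketch of this formula in Python (the function and argument names are assumptions):

```python
import numpy as np

def multiple_std_error(y, y_hat, k):
    """Multiple standard error of estimate for k independent variables."""
    sse = np.sum((np.asarray(y) - np.asarray(y_hat)) ** 2)  # unexplained variation
    return np.sqrt(sse / (len(y) - (k + 1)))
```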


Multiple Regression and Correlation Assumptions

• The independent variables and the dependent variable have a linear relationship.
• The dependent variable must be continuous and at least interval-scale.
• The variation in the residuals must be the same for all values of Y. When this is the case, we say the residuals exhibit homoscedasticity.
• The residuals should follow the normal distribution with mean 0.
• Successive values of the dependent variable must be uncorrelated.

The ANOVA Table

The ANOVA table reports the variation in the dependent variable, divided into two components.

• The Explained Variation is that accounted for by the set of independent variables.
• The Unexplained or Random Variation is that not accounted for by the independent variables.

Minitab – the ANOVA Table

Coefficient of Multiple Determination (R²)

Characteristics of the coefficient of multiple determination:

1. It is symbolized by a capital R squared, that is, written as R², because it behaves like the square of a correlation coefficient.
2. It can range from 0 to 1. A value near 0 indicates little association between the set of independent variables and the dependent variable; a value near 1 means a strong association.
3. It cannot assume negative values. Any number that is squared or raised to the second power cannot be negative.
4. It is easy to interpret. Because R² is a value between 0 and 1, it is easy to interpret, compare, and understand.

Coefficient of Multiple Determination (R²) – Formula

R² = SSR / SS total

where SSR is the regression (explained) sum of squares and SS total is the total sum of squares from the ANOVA table.

Minitab – the ANOVA Table

Adjusted Coefficient of Determination

• The number of independent variables in a multiple regression equation makes the coefficient of determination larger.
• If the number of independent variables, k, and the sample size, n, are equal, the coefficient of determination is 1.0.
• To balance the effect that the number of independent variables has on the coefficient of multiple determination, statistical software packages use an adjusted coefficient of multiple determination.
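The adjustment divides each sum of squares by its degrees of freedom: R²_adj = 1 - [SSE / (n - (k + 1))] / [SS total / (n - 1)]. A minimal sketch in Python (the function and argument names are assumptions):

```python
def adjusted_r2(sse, ss_total, n, k):
    """Adjusted coefficient of multiple determination for k predictors."""
    return 1 - (sse / (n - (k + 1))) / (ss_total / (n - 1))
```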

Adjusted Coefficient of Determination – Example

Correlation Matrix

A correlation matrix is used to show all possible simple correlation coefficients among the variables.

• The matrix is useful for locating correlated independent variables.
• It shows how strongly each independent variable is correlated with the dependent variable.

Global Test: Testing the Multiple Regression Model

The global test is used to investigate whether any of the independent variables have significant coefficients. The hypotheses are:

H_0: β_1 = β_2 = β_3 = 0
H_1: Not all of the β_i are 0

Global Test continued

• The test statistic is the F distribution with k (the number of independent variables) and n - (k + 1) degrees of freedom, where n is the sample size.
• Decision rule: Reject H_0 if F > F_{α, k, n-(k+1)}
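As a sketch, the critical value on the next slide can be found with scipy.stats rather than an F table (the use of Python here is an assumption; α = 0.05, k = 3, and n = 20 in the heating-cost example):

```python
from scipy.stats import f

k, n, alpha = 3, 20, 0.05
f_crit = f.ppf(1 - alpha, dfn=k, dfd=n - (k + 1))
print(round(f_crit, 2))  # critical F with 3 and 16 df: about 3.24
```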

Finding the Critical F

Finding the Computed F

[From the ANOVA table: F = (SSR / k) / (SSE / (n - (k + 1))) = 21.90.]

Interpretation

• The computed F is 21.90, which falls in the reject-H_0 region.
• The null hypothesis that all the multiple regression coefficients are zero is therefore rejected.
• Interpretation: some of the independent variables (amount of insulation, etc.) do have the ability to explain the variation in the dependent variable (heating cost).
• Logical question: which ones?

Evaluating Individual Regression Coefficients (β_i = 0)

• This test is used to determine which independent variables have nonzero regression coefficients. The variables that have zero regression coefficients are usually dropped from the analysis.
• The test statistic is the t distribution with n - (k + 1) degrees of freedom.
• The hypothesis test is as follows:

H_0: β_i = 0
H_1: β_i ≠ 0
Reject H_0 if t > t_{α/2, n-(k+1)} or t < -t_{α/2, n-(k+1)}
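As a sketch, the two-tailed critical values quoted on the following slides can be reproduced with scipy.stats (assuming α = 0.05 and n = 20, as in the example):

```python
from scipy.stats import t

alpha, n = 0.05, 20
print(t.ppf(1 - alpha / 2, df=n - (3 + 1)))  # about 2.120: full model, k = 3
print(t.ppf(1 - alpha / 2, df=n - (2 + 1)))  # about 2.110: reduced model, k = 2
```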

Critical t-stat for the Slopes

[Figure: t distribution with two-tailed rejection regions beyond -2.120 and 2.120.]

Computed t-stat for the Slopes

Conclusion on Significance of Slopes

[Temperature and insulation have significant coefficients; the coefficient for furnace age does not differ from zero, so the age variable is dropped.]

New Regression Model without Variable “Age” – Minitab


Testing the New Model for Significance

Critical t-stat for the New Slopes

[Figure: t distribution with two-tailed rejection regions beyond -2.110 and 2.110.]

Conclusion on Significance of New Slopes

Evaluating the Assumptions of Multiple Regression

1. There is a linear relationship. That is, there is a straight-line relationship between the dependent variable and the set of independent variables.
2. The variation in the residuals is the same for both large and small values of the estimated Y. To put it another way, the size of a residual is unrelated to whether the estimated Y is large or small.
3. The residuals follow the normal probability distribution.
4. The independent variables should not be correlated. That is, we would like to select a set of independent variables that are not themselves correlated.
5. The residuals are independent. This means that successive observations of the dependent variable are not correlated. This assumption is often violated when time is involved with the sampled observations.

Analysis of Residuals

A residual is the difference between the actual value of Y and the predicted value Ŷ.

• Residuals should be approximately normally distributed. Histograms and stem-and-leaf charts are useful in checking this requirement.
• A plot of the residuals against their corresponding Ŷ values is used to show that there are no trends or patterns in the residuals.
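A minimal sketch of both checks in Python (matplotlib is an assumption; any plotting tool works), given arrays of observed and fitted values:

```python
import numpy as np
import matplotlib.pyplot as plt

def residual_checks(y, y_hat):
    """Histogram of residuals plus a residuals-versus-fitted plot."""
    resid = np.asarray(y) - np.asarray(y_hat)
    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
    ax1.hist(resid, bins=10)                  # roughly bell-shaped?
    ax1.set(title="Residual histogram", xlabel="Residual")
    ax2.scatter(y_hat, resid)                 # any trend or funnel shape?
    ax2.axhline(0, color="gray")
    ax2.set(title="Residuals vs fitted", xlabel="Fitted Y", ylabel="Residual")
    plt.show()
```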

Scatter Diagram

Residual Plot

Distribution of Residuals

Both Minitab and Excel offer another graph that helps to evaluate the assumption of normally distributed residuals. It is called a normal probability plot and is shown to the right of the histogram.

Multicollinearity

• Multicollinearity exists when independent variables (X's) are correlated.
• Correlated independent variables make it difficult to make inferences about the individual regression coefficients (slopes) and their individual effects on the dependent variable (Y).
• However, correlated independent variables do not affect a multiple regression equation's ability to predict the dependent variable (Y).

Effect of Multicollinearity

• Not a problem: Multicollinearity does not affect a multiple regression equation's ability to predict the dependent variable.
• A problem: Multicollinearity may show unexpected results in evaluating the relationship between each individual independent variable and the dependent variable (a.k.a. partial correlation analysis).

Clues Indicating Problems with Multicollinearity

1. An independent variable known to be an important predictor ends up having a regression coefficient that is not significant.
2. A regression coefficient that should have a positive sign turns out to be negative, or vice versa.
3. When an independent variable is added or removed, there is a drastic change in the values of the remaining regression coefficients.

Variance Inflation Factor

• A general rule is that if the correlation between two independent variables is between -0.70 and 0.70, there likely is not a problem in using both of the independent variables.
• A more precise test is to use the variance inflation factor (VIF), whose value is found as follows:

VIF = 1 / (1 - R²_j)

• The term R²_j refers to the coefficient of determination obtained when the selected independent variable is used as the dependent variable and the remaining independent variables are used as the independent variables.
• A VIF greater than 10 is considered unsatisfactory, indicating that the independent variable should be removed from the analysis.
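A minimal sketch of this calculation in Python (the function name and data layout are assumptions): regress each X column on the remaining columns, then apply VIF = 1 / (1 - R²_j).

```python
import numpy as np

def vif(X):
    """VIF for each column of X (rows = observations, columns = predictors)."""
    X = np.asarray(X, dtype=float)
    factors = []
    for j in range(X.shape[1]):
        y = X[:, j]                               # treat X_j as the response
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(y)), others])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2_j = 1 - (resid @ resid) / np.sum((y - y.mean()) ** 2)
        factors.append(1.0 / (1.0 - r2_j))        # VIF = 1 / (1 - R²_j)
    return factors
```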

Multicollinearity – Example

Refer to the data in the table, which relates the heating cost to the independent variables outside temperature, amount of insulation, and age of furnace. Develop a correlation matrix for all the independent variables. Does it appear there is a problem with multicollinearity? Find and interpret the variance inflation factor for each of the independent variables.

Correlation Matrix - Minitab

VIF – Minitab Example

The VIF value of 1.32 is less than the upper limit of 10. This indicates that the independent variable temperature is not strongly correlated with the other independent variables.

Independence Assumption

• The fifth assumption about regression and correlation analysis is that successive residuals should be independent.
• When successive residuals are correlated, we refer to this condition as autocorrelation.
• Autocorrelation frequently occurs when the data are collected over a period of time.

Residual Plot versus Fitted Values

• The graph shows the residuals plotted on the vertical axis and the fitted values on the horizontal axis.
• Note the run of residuals above the mean of the residuals, followed by a run below the mean. A scatter plot such as this would indicate possible autocorrelation.

Qualitative Independent Variables

• Frequently we wish to use nominal-scale variables in our analysis, such as gender, whether the home has a swimming pool, or whether the sports team was the home or the visiting team. These are called qualitative variables.
• To use a qualitative variable in regression analysis, we use a scheme of dummy variables in which one of the two possible conditions is coded 0 and the other 1.
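A minimal sketch of dummy coding in Python with pandas (the column names and values are made up for illustration):

```python
import pandas as pd

homes = pd.DataFrame({
    "cost":   [250, 360, 165, 43],
    "garage": ["yes", "no", "yes", "no"],  # qualitative variable
})
# Code the two conditions: attached garage = 1, no garage = 0.
homes["garage_dummy"] = (homes["garage"] == "yes").astype(int)
print(homes)
```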

Qualitative Variable - Example

Suppose in the Salsberry Realty example that the independent variable “garage” is added. For those homes without an attached garage, 0 is used; for homes with an attached garage, 1 is used. We will refer to the “garage” variable as X_4. The data shown in the table are entered into the Minitab system.

Qualitative Variable - Minitab

Using the Model for Estimation

What is the effect of the garage variable? Suppose we have two houses exactly alike next to each other in Buffalo, New York; one has an attached garage, and the other does not. Both homes have 3 inches of insulation, and the mean January temperature in Buffalo is 20 degrees.

For the house without an attached garage, a 0 is substituted for X_4 in the regression equation; the estimated heating cost is $280.90. For the house with an attached garage, a 1 is substituted for X_4; the estimated heating cost is $358.30.
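One line of algebra shows why this gap is a clean reading of the garage effect: the two homes match on every other variable, so the temperature and insulation terms cancel when the two predictions are subtracted, leaving only the dummy-variable coefficient (the full fitted equation is in the Minitab output, not reproduced here):

Ŷ_with - Ŷ_without = b_4(1) - b_4(0) = b_4 = $358.30 - $280.90 = $77.40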

Testing the Model for Significance

We have shown the difference between the two types of homes to be $77.40, but is the difference significant? We conduct the following test of hypothesis:

H_0: β_i = 0
H_1: β_i ≠ 0
Reject H_0 if t > t_{α/2, n-(k+1)} or t < -t_{α/2, n-(k+1)}

Conclusion: The regression coefficient is not zero. The independent variable garage should be included in the analysis.

End of Chapter 14