Introduction to Regression Analysis

Two Purposes
• Explanation – Explain (or account for) the variance in a variable (e.g., explain why children’s test scores vary). We’ll cover this later.
• Prediction – Construct an equation to predict scores on some variable. Construct an equation that can be used in selecting individuals.

Prediction
• Use a set of scores collected from a sample to make predictions about individuals in the population (not in the sample).
• Use the scores to construct a mathematical (typically linear) equation that allows us to predict performance.
• Two types of scores are collected:
– Usually a measure on one criterion (outcome, dependent) variable.
– Scores on one or more predictor (independent) variables.

The equations
The equation for one individual’s criterion score: $Y_i = a + bX_i + e_i$
The prediction equation for that individual’s score: $Y'_i = a + bX_i$
The difference between the two equations (called a residual): $e_i = Y_i - Y'_i$

The function
The linear function has the form $Y' = a + \beta_1 X_1 + \beta_2 X_2 + \cdots + \beta_k X_k$, where the βs are weights (regression weights) selected such that the sum of squared errors is minimized (the least-squares criterion).
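The least-squares criterion can be illustrated numerically. The sketch below (made-up data; Python used only for illustration) fits a one-predictor equation and checks that perturbing the fitted weights never lowers the sum of squared errors:

```python
def fit(x, y):
    """Least-squares slope and intercept for one predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b          # intercept a, slope b

def sse(x, y, a, b):
    """Sum of squared errors for weights (a, b)."""
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

x = [1, 2, 3, 4, 5]                # made-up predictor scores
y = [2, 4, 5, 4, 5]                # made-up criterion scores
a, b = fit(x, y)
best = sse(x, y, a, b)
# Any perturbation of the least-squares weights raises the SSE.
assert all(best <= sse(x, y, a + da, b + db)
           for da in (-0.1, 0, 0.1) for db in (-0.1, 0, 0.1))
```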

Multiple Correlation
Minimizing the sum of squared errors causes the correlation between the actual criterion scores and the predicted scores to be maximized (as large as possible). This correlation is called the multiple correlation. It is the correlation between the criterion variable and a weighted linear composite of the predictor variables.
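With a single predictor, the multiple correlation reduces to the absolute value of the Pearson correlation between X and Y. A small numeric check (made-up data; Python for illustration):

```python
from math import sqrt

def pearson(u, v):
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    sp = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    return sp / sqrt(sum((a - mu) ** 2 for a in u) *
                     sum((b - mv) ** 2 for b in v))

x = [1, 2, 3, 4, 5]        # one predictor (made-up data)
y = [2, 4, 5, 4, 5]        # criterion (made-up data)
n = len(x)
mx, my = sum(x) / n, sum(y) / n
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
    sum((xi - mx) ** 2 for xi in x)
a = my - b * mx
y_hat = [a + b * xi for xi in x]    # the linear composite
R = pearson(y, y_hat)               # multiple correlation
# With one predictor, R equals |r_xy|.
assert abs(R - abs(pearson(x, y))) < 1e-12
```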

Coefficient of Determination
The square of the multiple correlation, $R^2$, is called the coefficient of determination. It gives the proportion of variance shared between the criterion variable and the weighted linear composite. Hence, the larger the $R^2$, the better the prediction equation.

Basic regression equation
$Y' = a + bX$

Computing the constants in the regression equation
$b = \dfrac{\sum (X - \bar{X})(Y - \bar{Y})}{\sum (X - \bar{X})^2} = \dfrac{SP_{xy}}{SS_x}$ and $a = \bar{Y} - b\bar{X}$, where $\bar{X}$ and $\bar{Y}$ are the sample means.
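As a numeric sketch of the slope and intercept formulas (made-up data; Python for illustration):

```python
x = [1, 2, 3, 4, 5]   # made-up predictor scores
y = [2, 4, 5, 4, 5]   # made-up criterion scores
n = len(x)
mean_x, mean_y = sum(x) / n, sum(y) / n
sp_xy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))  # sum of cross-products
ss_x = sum((xi - mean_x) ** 2 for xi in x)                          # sum of squares of X
b = sp_xy / ss_x            # slope
a = mean_y - b * mean_x     # intercept
```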

A closer look at the regression equation

Partitioning the Sum of Squares (SSy)
$SS_y$ is given by $SS_y = \sum (Y - \bar{Y})^2$. Now consider the following identity: $Y = Y' + (Y - Y')$. Subtracting $\bar{Y}$ from each side gives $Y - \bar{Y} = (Y' - \bar{Y}) + (Y - Y')$. Squaring and summing gives $\sum (Y - \bar{Y})^2 = \sum (Y' - \bar{Y})^2 + 2\sum (Y' - \bar{Y})(Y - Y') + \sum (Y - Y')^2$.

Simplifying the previous equation
$SS_y = SS_{reg} + SS_{res}$, where $SS_{reg} = \sum (Y' - \bar{Y})^2$ is the sum of squares due to regression and $SS_{res} = \sum (Y - Y')^2$ is the residual sum of squares. Dividing through by the total sum of squares gives $1 = \dfrac{SS_{reg}}{SS_y} + \dfrac{SS_{res}}{SS_y}$, or $1 = R^2 + (1 - R^2)$.
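The partition of the sum of squares can be verified numerically; a sketch with made-up data (Python for illustration):

```python
x = [1, 2, 3, 4, 5]   # made-up predictor scores
y = [2, 4, 5, 4, 5]   # made-up criterion scores
n = len(x)
mx, my = sum(x) / n, sum(y) / n
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
    sum((xi - mx) ** 2 for xi in x)
a = my - b * mx
y_hat = [a + b * xi for xi in x]

ss_y   = sum((yi - my) ** 2 for yi in y)                  # total sum of squares
ss_reg = sum((yh - my) ** 2 for yh in y_hat)              # due to regression
ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))  # residual
assert abs(ss_y - (ss_reg + ss_res)) < 1e-9               # SSy = SSreg + SSres
r_squared = ss_reg / ss_y                                 # proportion explained
```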

Example
Y    X
3    1
0    4
5    1
0    1
-1   2

Calculation of squares and cross-products
• Deviation squares and cross-products
• Sums of squares and cross-products

Calculation of the coefficients
The slope, the intercept, and the regression line.

Calculation of SSreg
From an earlier equation, $SS_{reg} = \sum (Y' - \bar{Y})^2$.

Some additional equations for SSreg
$SS_{reg} = b\,SP_{xy} = \dfrac{SP_{xy}^2}{SS_x} = b^2\,SS_x$. Hence, $SS_{reg}$ can be computed without first computing the predicted scores.

SSreg computed from a correlation
The formula for the Pearson correlation is $r = \dfrac{SP_{xy}}{\sqrt{SS_x\,SS_y}}$; therefore, $SS_{reg} = r^2\,SS_y$.
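A quick numeric check that SSreg computed from the correlation matches the direct calculation (made-up data; Python for illustration):

```python
from math import sqrt

x = [1, 2, 3, 4, 5]   # made-up predictor scores
y = [2, 4, 5, 4, 5]   # made-up criterion scores
n = len(x)
mx, my = sum(x) / n, sum(y) / n
sp   = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
ss_x = sum((xi - mx) ** 2 for xi in x)
ss_y = sum((yi - my) ** 2 for yi in y)
r = sp / sqrt(ss_x * ss_y)        # Pearson correlation
b = sp / ss_x
a = my - b * mx
ss_reg = sum((a + b * xi - my) ** 2 for xi in x)  # direct calculation
assert abs(ss_reg - r ** 2 * ss_y) < 1e-9         # SSreg = r^2 * SSy
```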

A Closer Look at the Equations in Regression Analysis

The Variance
$s_x^2 = \dfrac{\sum (X - \bar{X})^2}{N - 1}$

The standard deviation
$s_x = \sqrt{s_x^2} = \sqrt{\dfrac{\sum (X - \bar{X})^2}{N - 1}}$

The covariance
$s_{xy} = \dfrac{\sum (X - \bar{X})(Y - \bar{Y})}{N - 1}$

The Pearson product-moment correlation
$r_{xy} = \dfrac{s_{xy}}{s_x s_y} = \dfrac{SP_{xy}}{\sqrt{SS_x\,SS_y}}$
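The deviation-score form of the correlation translates directly to code; a sketch (made-up data; Python for illustration):

```python
from math import sqrt

def pearson(x, y):
    """r = SP_xy / sqrt(SS_x * SS_y), computed from deviation scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sp   = sum((a - mx) * (b - my) for a, b in zip(x, y))
    ss_x = sum((a - mx) ** 2 for a in x)
    ss_y = sum((b - my) ** 2 for b in y)
    return sp / sqrt(ss_x * ss_y)

r = pearson([1, 2, 3, 4, 5], [2, 4, 5, 4, 5])   # made-up data
```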

The normal equations (for the regression of Y on X)
$\sum Y = Na + b \sum X$
$\sum XY = a \sum X + b \sum X^2$

The structural model (for an observation on individual i)
$Y_i = \alpha + \beta X_i + \varepsilon_i$

The regression equation
$Y'_i = a + bX_i$

Partitioning a deviation score, y
$y = Y - \bar{Y} = (Y' - \bar{Y}) + (Y - Y')$

The score, Y, is partitioned
Hence, Y is partitioned into a deviation of a predicted score from the mean of the scores PLUS a deviation of the actual score from the predicted score. Our next step is to square the deviations and sum over all the scores.

Partitioning the sum of squared deviations (sum of squares, SSy)
$\sum (Y - \bar{Y})^2 = \sum (Y' - \bar{Y})^2 + 2\sum (Y' - \bar{Y})(Y - Y') + \sum (Y - Y')^2$

What happened to the cross-product term, $2\sum (Y' - \bar{Y})(Y - Y')$? Showing that it reduces to zero requires some complicated algebra, recalling that $Y' = a + bX$ and that the residuals $Y - Y'$ sum to zero and are uncorrelated with X.

Calculation of proportions of sums of squares due to regression and due to error (or residual)
$\dfrac{SS_{reg}}{SS_y} = r^2$ and $\dfrac{SS_{res}}{SS_y} = 1 - r^2$

Alternative formulas for computing the sum of squares due to regression
$SS_{reg} = b\,SP_{xy} = \dfrac{SP_{xy}^2}{SS_x} = r^2\,SS_y$

Test of the regression coefficient, $b_{yx}$ (i.e., test the null hypothesis that $b_{yx} = 0$)
First compute the variance of estimate: $s_{y.x}^2 = \dfrac{SS_{res}}{N - k - 1}$

Test of the regression coefficient, $b_{yx}$ (i.e., test the null hypothesis that $b_{yx} = 0$)
Then obtain the standard error of estimate, $s_{y.x} = \sqrt{s_{y.x}^2}$. Then compute the standard error of the regression coefficient, $s_b = \dfrac{s_{y.x}}{\sqrt{SS_x}}$.

The test of significance of the regression coefficient ($b_{yx}$)
The significance of the regression coefficient is tested using a t test with $N - k - 1$ degrees of freedom: $t = \dfrac{b_{yx}}{s_b}$
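Putting the last few slides together (variance of estimate, standard errors, then t), a sketch with made-up data and one predictor, so k = 1 (Python for illustration):

```python
from math import sqrt

x = [1, 2, 3, 4, 5]   # made-up predictor scores
y = [2, 4, 5, 4, 5]   # made-up criterion scores
n, k = len(x), 1
mx, my = sum(x) / n, sum(y) / n
ss_x = sum((xi - mx) ** 2 for xi in x)
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / ss_x
a = my - b * mx
ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

var_est = ss_res / (n - k - 1)   # variance of estimate
s_yx = sqrt(var_est)             # standard error of estimate
s_b = s_yx / sqrt(ss_x)          # standard error of b
t = b / s_b                      # compare to t with n - k - 1 df
```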

Computing regression using correlations
The correlation in the population is given by $\rho_{xy} = \dfrac{\sigma_{xy}}{\sigma_x \sigma_y}$. The population correlation coefficient, $\rho_{xy}$, is estimated by the sample correlation coefficient, $r_{xy}$.

Sums of squares, regression (SSreg)
Recalling that $R^2$ gives the proportion of variance of Y accounted for (or explained) by X, we can obtain $SS_{reg} = R^2\,SS_y$; in other words, $SS_{reg}$ is that portion of $SS_y$ predicted or explained by the regression of Y on X.

Standard error of estimate
From $SS_{res}$ we can compute the variance of estimate and standard error of estimate as $s_{y.x}^2 = \dfrac{SS_{res}}{N - 2}$ and $s_{y.x} = \sqrt{\dfrac{SS_{res}}{N - 2}}$. (Note: alternative formulas were given earlier.)

Testing the Significance of r
The significance of a correlation coefficient, r, is tested using a t test with $N - 2$ degrees of freedom: $t = \dfrac{r\sqrt{N - 2}}{\sqrt{1 - r^2}}$
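A numeric sketch of this t test (made-up data; Python for illustration). In simple regression it yields the same t value as the test of the regression coefficient:

```python
from math import sqrt

x = [1, 2, 3, 4, 5]   # made-up data
y = [2, 4, 5, 4, 5]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
sp = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
r = sp / sqrt(sum((xi - mx) ** 2 for xi in x) *
              sum((yi - my) ** 2 for yi in y))
t = r * sqrt(n - 2) / sqrt(1 - r ** 2)   # df = n - 2
```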

Testing the difference between two correlations
To test the difference between two Pearson correlation coefficients, use the “Comparing two correlation coefficients” calculator on my web site.

Testing the difference between two regression coefficients
This, also, is a t test: $t = \dfrac{b_1 - b_2}{\sqrt{s_{b_1}^2 + s_{b_2}^2}}$, where $s_b$ was given earlier.

Point-biserial and Phi correlation
These are both Pearson product-moment correlations. The point-biserial correlation is used when one variable is a scale variable and the other represents a true dichotomy. For instance, the correlation between performance on an item (the dichotomous variable) and the total score on a test (the scaled variable).

Point-biserial and Phi correlation
The Phi correlation is used when both variables represent a true dichotomy. For instance, the correlation between two test items.

Biserial and Tetrachoric correlation
These are non-Pearson correlations. Both are rarely used anymore. The biserial correlation is used when one variable is truly a scaled variable and the other represents an artificial dichotomy. The tetrachoric correlation is used when both variables represent an artificial dichotomy.

Spearman’s Rho Coefficient and Kendall’s Tau Coefficient
Spearman’s rho is used to compute the correlation between two ordinal (or ranked) variables. It is the correlation between two sets of ranks.
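Spearman’s rho is literally a Pearson correlation computed on ranks. A sketch (made-up ranked data; Python for illustration, with tied values given the average of their ranks):

```python
from math import sqrt

def ranks(v):
    """1-based ranks; tied values get the average of their positions."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average rank of the tie group
        for idx in order[i:j + 1]:
            r[idx] = avg
        i = j + 1
    return r

def pearson(u, v):
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    sp = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    return sp / sqrt(sum((a - mu) ** 2 for a in u) *
                     sum((b - mv) ** 2 for b in v))

def spearman(x, y):
    return pearson(ranks(x), ranks(y))

rho = spearman([1, 2, 3, 4, 5], [2, 1, 4, 3, 5])   # made-up data
```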