Constrained ordination: Regression is the key to understanding

Studying a community means studying the individual species and comparing them; example gradients: Moisture, Manure (nutrients)

Linear Regression: the model $Y = b_0 + b_1 X + e$
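
A minimal sketch of fitting this model by ordinary least squares in Python with NumPy; the moisture and abundance values are invented purely for illustration.

```python
import numpy as np

# Invented example data: moisture (predictor) and abundance of one species (response)
moisture = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
abundance = np.array([2.1, 2.9, 4.2, 4.8, 6.1, 6.9])

# Design matrix with an intercept column; least squares gives estimates of b0 and b1
X = np.column_stack([np.ones_like(moisture), moisture])
(b0, b1), *_ = np.linalg.lstsq(X, abundance, rcond=None)

fitted = b0 + b1 * moisture
residuals = abundance - fitted   # the error term e of the model
print(f"b0 = {b0:.2f}, b1 = {b1:.2f}")
```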

Linear Regression: the quality
• Total sum of squares (TSS): $\sum_i (Y_i - \bar{Y})^2$
• Model sum of squares (MSS): $\sum_i (\hat{Y}_i - \bar{Y})^2$
• Residual sum of squares (RSS): $\sum_i (Y_i - \hat{Y}_i)^2$
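
A short sketch of computing the three sums of squares and the resulting $R^2 = \mathrm{MSS}/\mathrm{TSS} = 1 - \mathrm{RSS}/\mathrm{TSS}$, reusing the invented data from the regression example above.

```python
import numpy as np

# Invented data from the regression sketch above
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1, 6.9])

# OLS fit (intercept + slope) and the fitted values
b, *_ = np.linalg.lstsq(np.column_stack([np.ones_like(x), x]), y, rcond=None)
y_hat = b[0] + b[1] * x

tss = np.sum((y - y.mean()) ** 2)      # total sum of squares
mss = np.sum((y_hat - y.mean()) ** 2)  # model sum of squares
rss = np.sum((y - y_hat) ** 2)         # residual sum of squares

print(f"TSS={tss:.2f}  MSS={mss:.2f}  RSS={rss:.2f}")   # TSS = MSS + RSS
print(f"R^2 = {mss / tss:.3f}")
```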

Multiple responses, multiple predictors
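
With community data there are many responses (species) and several predictors (here moisture and manure); the straightforward extension is one multiple regression per species. A minimal sketch with an invented community table:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_species = 20, 5

# Invented predictors and community table (rows = samples, columns = species abundances)
moisture = rng.uniform(1, 5, n_samples)
manure = rng.uniform(0, 3, n_samples)
Y = rng.poisson(lam=3.0, size=(n_samples, n_species)).astype(float)

# One multiple regression per species, solved in a single least-squares call:
# each column of B holds (intercept, b_moisture, b_manure) for one species.
E = np.column_stack([np.ones(n_samples), moisture, manure])
B, *_ = np.linalg.lstsq(E, Y, rcond=None)
print(B.round(2))
```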

The best predictors ever: principal components
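
A minimal sketch of extracting the first two principal components of a centred species table with scikit-learn; the community data are invented. The printed value is the fraction of total species variance captured by the two axes, i.e. the kind of $\lambda_1 + \lambda_2$ figure quoted on the next slide (the 0.51 there comes from the lecture's own data set).

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
Y = rng.poisson(lam=3.0, size=(20, 5)).astype(float)   # invented community table

# Centre the species (columns) and extract the first two principal components
Yc = Y - Y.mean(axis=0)
pca = PCA(n_components=2)
scores = pca.fit_transform(Yc)   # sample scores on PCA1 and PCA2: the "best predictors"

# Fraction of total variance captured by the first two axes (lambda1 + lambda2 over the total)
print(f"{pca.explained_variance_ratio_.sum():.2f}")
```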

Comparing two regressions with the first two PCA axes. PCA: $\lambda_1 + \lambda_2 = 0.51$

PCA ordination diagram
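
A PCA ordination diagram is essentially a scatterplot of the sample scores with the species loadings overlaid as arrows. A minimal, self-contained matplotlib sketch on invented data (a real biplot also requires choosing a scaling for scores versus loadings):

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
Y = rng.poisson(lam=3.0, size=(20, 5)).astype(float)   # invented community table
pca = PCA(n_components=2)
scores = pca.fit_transform(Y - Y.mean(axis=0))

fig, ax = plt.subplots()
ax.scatter(scores[:, 0], scores[:, 1], s=20)           # sample scores
for j, (lx, ly) in enumerate(pca.components_.T):       # species loadings as arrows
    ax.arrow(0, 0, lx, ly, head_width=0.05, color="red")
    ax.annotate(f"sp{j + 1}", (lx, ly))
ax.set_xlabel("PCA 1")
ax.set_ylabel("PCA 2")
plt.show()
```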

What to do with measured environmental factors? - I

What to do with measured environmental factors? - II
• Predicting species values using PCA 1 and PCA 2: $y_{ik} = b_{1k} \cdot \mathrm{PCA1}_i + b_{2k} \cdot \mathrm{PCA2}_i + e$
• Constraining the definition of the scores (PCA → RDA): $\mathrm{RDA1}_i = c_{11} \cdot \mathrm{Moisture}_i + c_{12} \cdot \mathrm{Manure}_i$
• Similarly: $\mathrm{RDA2}_i = c_{21} \cdot \mathrm{Moisture}_i + c_{22} \cdot \mathrm{Manure}_i$
• Consequently: $y_{ik} = b_{1k} c_{11} \cdot \mathrm{Moisture}_i + b_{1k} c_{12} \cdot \mathrm{Manure}_i + b_{2k} c_{21} \cdot \mathrm{Moisture}_i + b_{2k} c_{22} \cdot \mathrm{Manure}_i$
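
Computationally, this substitution is what redundancy analysis (RDA) does: regress the (centred) species data on the environmental variables and then run a PCA on the fitted values, so the ordination axes are by construction linear combinations of Moisture and Manure. A minimal sketch on invented data; dedicated software (e.g. vegan's rda() in R) additionally handles scaling, weights, and significance testing.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
n = 20
moisture = rng.uniform(1, 5, n)
manure = rng.uniform(0, 3, n)
Y = rng.poisson(lam=3.0, size=(n, 5)).astype(float)    # invented community table
Yc = Y - Y.mean(axis=0)

# Step 1: multivariate regression of the centred species data on the environment
E = np.column_stack([np.ones(n), moisture, manure])
C, *_ = np.linalg.lstsq(E, Yc, rcond=None)
Y_fit = E @ C                                          # fitted species values

# Step 2: PCA of the fitted values gives the constrained (RDA) axes;
# the resulting site scores are linear combinations of Moisture and Manure only
rda = PCA(n_components=2)
rda_scores = rda.fit_transform(Y_fit)
print(rda_scores[:3].round(2))
```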

The boiled-down predictors: constrained axes

Definition of constrained ordination axes

Comparing regressions, PCA axes, and RDA axes. RDA: $\lambda_1 + \lambda_2 = 0.37$
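
The constrained eigenvalues can never exceed the unconstrained ones, so $\lambda_1 + \lambda_2$ for RDA is at most the PCA value (0.37 versus 0.51 in the lecture's data set). A sketch of that comparison on the same invented data as above, expressing both sums as fractions of the total species variance:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
n = 20
moisture = rng.uniform(1, 5, n)
manure = rng.uniform(0, 3, n)
Y = rng.poisson(lam=3.0, size=(n, 5)).astype(float)    # invented community table
Yc = Y - Y.mean(axis=0)
total_var = np.sum(Yc ** 2) / (n - 1)                  # total variance of the species data

# Unconstrained: first two PCA axes of the species data
pca_frac = PCA(n_components=2).fit(Yc).explained_variance_ratio_.sum()

# Constrained: first two RDA axes (PCA of the values fitted from the environment)
E = np.column_stack([np.ones(n), moisture, manure])
Y_fit = E @ np.linalg.lstsq(E, Yc, rcond=None)[0]
rda_frac = PCA(n_components=2).fit(Y_fit).explained_variance_.sum() / total_var

print(f"PCA axes 1+2: {pca_frac:.2f} of total variance")
print(f"RDA axes 1+2: {rda_frac:.2f} of total variance")
```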

RDA: alternative interpretation

When linearity is not a good idea
• Weighted regression on proportional data leads to the weighted averaging approach: $y_{ik} \rightarrow (y_{ik}/y_{+k})/(y_{i+}/y_{++})$, with case weights $y_{i+}$ and variable weights $y_{+k}$
• Roughly:
  - the resulting gradients are the best predictors for a unimodal response model
  - species scores represent the optima
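
A sketch of the weighted-averaging idea on simulated unimodal data: each species' score is the abundance-weighted average of the sample positions along the gradient, which approximates its optimum. All data below are invented; dedicated unimodal methods (CA/CCA) build on the transformation given above.

```python
import numpy as np

rng = np.random.default_rng(3)
n_samples = 30

# Invented moisture gradient and unimodal species responses along it
moisture = np.linspace(0, 10, n_samples)
true_optima = np.array([2.0, 4.0, 6.0, 8.0])
expected = 5 * np.exp(-((moisture[:, None] - true_optima) ** 2))
Y = rng.poisson(expected).astype(float)                # abundances, rows = samples

# Weighted average of sample positions, weighted by abundance (column totals = y+k)
wa_scores = (Y * moisture[:, None]).sum(axis=0) / Y.sum(axis=0)
print(wa_scores.round(2))   # close to the simulated optima under a unimodal response
```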

Species scores vs. optima