Algebra
For linear combinations U = ∑ a(t) Y(t) and V = ∑ b(t) Y(t):
E{U} = c_Y ∑ a(t)
cov{U, V} = ∑_s ∑_t a(s) b(t) c_YY(s−t)
U is Gaussian if {Y(t)} is Gaussian
Some useful stochastic models
Purely random / white noise: {Y(t)} i.i.d. (often the mean is assumed 0)
c_YY(u) = cov{Y(t+u), Y(t)} = σ_Y² if u = 0, = 0 if u ≠ 0
ρ_YY(u) = 1 if u = 0, = 0 if u ≠ 0
A building block
Random walk
Y(t) = Y(t−1) + Z(t), Y(0) = 0, so Y(t) = ∑_{i=1}^{t} Z(i)
E{Y(t)} = t μ_Z, var{Y(t)} = t σ_Z²
Not stationary, but the first difference ∆Y(t) = Y(t) − Y(t−1) = Z(t) is
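Not part of the slides, but a minimal NumPy sketch of the point above: a random walk is the cumulative sum of i.i.d. steps, and differencing recovers the white-noise steps exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 10_000
Z = rng.normal(loc=0.5, scale=1.0, size=T)  # i.i.d. steps with mean 0.5

# Random walk: Y(t) = Y(t-1) + Z(t), Y(0) = 0, i.e. the cumulative sum of steps
Y = np.cumsum(Z)

# Differencing: dY(t) = Y(t) - Y(t-1) recovers the white-noise steps
dY = np.diff(Y, prepend=0.0)
print(np.allclose(dY, Z))  # True
```

The walk itself drifts like t·μ_Z and has variance growing like t·σ_Z², so it is not stationary; its differences are.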
Moving average, MA(q)
Y(t) = β(0)Z(t) + β(1)Z(t−1) + … + β(q)Z(t−q)
If E{Z(t)} = 0, then E{Y(t)} = 0
c_YY(u) = σ_Z² ∑_{t=0}^{q−u} β(t) β(t+u) for u = 0, 1, …, q
c_YY(u) = 0 for u > q, and c_YY(−u) = c_YY(u)
Stationary
MA(1): ρ_YY(0) = 1, ρ_YY(±1) = β(1)/(1 + β(1)²), ρ_YY(u) = 0 otherwise
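A quick empirical check of the MA(1) formula above (a NumPy sketch, not from the slides; β(0) is taken as 1): the sample a.c.f. should match β(1)/(1 + β(1)²) at lag 1 and cut off after lag q = 1.

```python
import numpy as np

rng = np.random.default_rng(1)
beta1 = 0.6
T = 200_000
Z = rng.standard_normal(T + 1)

# MA(1): Y(t) = Z(t) + beta1 * Z(t-1)
Y = Z[1:] + beta1 * Z[:-1]

def sample_acf(x, lag):
    # Sample autocorrelation at the given lag
    x = x - x.mean()
    return np.dot(x[lag:], x[:len(x) - lag]) / np.dot(x, x)

theory = beta1 / (1 + beta1 ** 2)  # rho_YY(1) for MA(1)
print(sample_acf(Y, 1), theory)    # close to each other
print(sample_acf(Y, 2))            # near 0: a.c.f. cuts off after lag q = 1
```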
Backward shift operator
Recall the translation operator T_u: T_u Y(t) = Y(t+u)
Backward shift: B^j Y(t) = Y(t−j)
Linear process: Y(t) = ∑_i ψ(i) Z(t−i). Need a convergence condition, e.g. ∑_i |ψ(i)| < ∞ or ∑_i ψ(i)² < ∞
Autoregressive process, AR(p)
First-order, AR(1): Y(t) = α Y(t−1) + Z(t) (**) — Markov
A linear process, invertible
Need |α| < 1 for convergence in probability / stationarity
a.c.f. of AR(1): iterating (**) from the previous slide gives ρ_YY(u) = α^|u|
p.a.c.f. (using the normal or linear definitions):
corr{Y(t), Y(t−m) | Y(t−1), …, Y(t−m+1)} = 0 for m > p when Y is AR(p)
Proof: via multiple regression
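To illustrate (a NumPy sketch, not from the slides): simulate an AR(1) with |α| < 1 and compare the sample a.c.f. with the geometric decay α^|u|.

```python
import numpy as np

rng = np.random.default_rng(2)
alpha = 0.7          # |alpha| < 1 for stationarity
T = 200_000
Z = rng.standard_normal(T)

# AR(1): Y(t) = alpha * Y(t-1) + Z(t)
Y = np.empty(T)
Y[0] = Z[0]
for t in range(1, T):
    Y[t] = alpha * Y[t - 1] + Z[t]

def sample_acf(x, lag):
    # Sample autocorrelation at the given lag
    x = x - x.mean()
    return np.dot(x[lag:], x[:len(x) - lag]) / np.dot(x, x)

# Theoretical a.c.f. of AR(1) is alpha**|u|
for u in (1, 2, 3):
    print(u, sample_acf(Y, u), alpha ** u)
```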
In the general AR(p) case the analogous cut-off holds. Useful for prediction
Yule–Walker equations for AR(p):
ρ_YY(u) = ∑_{k=1}^{p} α(k) ρ_YY(u−k), u > 0
Obtained by correlating each side of the AR(p) equation with Y(t−u)
Sometimes used for estimation
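The estimation use can be sketched as follows (NumPy, not from the slides): simulate an AR(2), form the sample autocovariances, and solve the Yule–Walker system for the coefficients.

```python
import numpy as np

rng = np.random.default_rng(3)
a1, a2 = 0.5, -0.3   # true AR(2) coefficients (stationary choice)
p, T = 2, 200_000
Z = rng.standard_normal(T)

# Simulate AR(2): Y(t) = a1*Y(t-1) + a2*Y(t-2) + Z(t)
Y = np.zeros(T)
for t in range(p, T):
    Y[t] = a1 * Y[t - 1] + a2 * Y[t - 2] + Z[t]

def acov(x, lag):
    # Sample autocovariance at the given lag
    x = x - x.mean()
    return np.dot(x[lag:], x[:len(x) - lag]) / len(x)

# Yule-Walker: R a = r, with R[i, j] = c(|i-j|) and r[i] = c(i+1)
c = np.array([acov(Y, u) for u in range(p + 1)])
R = np.array([[c[abs(i - j)] for j in range(p)] for i in range(p)])
a_hat = np.linalg.solve(R, c[1:])
print(a_hat)  # close to [0.5, -0.3]
```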
ARMA(p, q): φ(B)Y_t = θ(B)Z_t, with φ and θ polynomials of degrees p and q
ARIMA(p, d, q)
∇X_t = X_t − X_{t−1}
∇²X_t = X_t − 2X_{t−1} + X_{t−2}
arima.mle() fits by maximum likelihood assuming Gaussian noise
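The differencing operator ∇ above is just `np.diff` (a NumPy sketch, not from the slides); differencing d times removes a polynomial trend of degree d, which is the "I" in ARIMA.

```python
import numpy as np

X = np.array([1.0, 4.0, 9.0, 16.0, 25.0])  # quadratic trend t**2

d1 = np.diff(X)       # first difference:  X(t) - X(t-1)
d2 = np.diff(X, n=2)  # second difference: X(t) - 2*X(t-1) + X(t-2)

print(d1)  # [3. 5. 7. 9.]
print(d2)  # [2. 2. 2.]  -- quadratic trend removed with d = 2
```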
ARMAX: φ(B)Y_t = β(B)X_t + θ(B)Z_t — arima.mle(…, xreg, …)
State space: s_t = F_t(s_{t−1}, Z_t), Y_t = H_t(s_t, Z_t); could include X
Next: i.i.d. → mixing stationary process
Mixing has a variety of definitions, e.g. in the normal case ∑_u |c_YY(u)| < ∞; see e.g. Cryer and Chan (2008)
CLT: m_Y = Ȳ = ∑_{t=1}^{T} Y(t)/T is approximately normal with
E{m_Y} = c_Y
var{m_Y} = T^{−2} ∑_{s=1}^{T} ∑_{t=1}^{T} c_YY(s−t) ≈ T^{−1} ∑_u c_YY(u)
= σ_YY/T if white noise
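A Monte Carlo check of the variance formula above (a NumPy sketch, not from the slides): for an AR(1) with unit innovation variance, ∑_u c_YY(u) works out to 1/(1−α)², so var{m_Y} ≈ 1/((1−α)²T).

```python
import numpy as np

rng = np.random.default_rng(4)
alpha, T, reps = 0.5, 500, 4000

# Simulate many independent AR(1) series of length T at once
Z = rng.standard_normal((reps, T))
Y = np.empty((reps, T))
Y[:, 0] = Z[:, 0]
for t in range(1, T):
    Y[:, t] = alpha * Y[:, t - 1] + Z[:, t]

# Monte Carlo variance of the sample mean m_Y
mc_var = Y.mean(axis=1).var()

# Asymptotic formula: var{m_Y} ~ (1/T) * sum_u c_YY(u) = 1 / ((1-alpha)^2 * T)
approx = 1.0 / ((1 - alpha) ** 2 * T)
print(mc_var, approx)  # close to each other
```

Note the correlated case inflates var{m_Y} relative to white noise: here the factor 1/(1−α)² = 4 rather than the i.i.d. value σ_YY/T.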
Cumulants: cum(Y_1, Y_2, …, Y_k) extends mean, variance, covariance
cum(Y) = E{Y}, cum(Y, Y) = var{Y}, cum(X, Y) = cov{X, Y}
DRB (1975)
Proof of the ordinary CLT. S_T = Y(1) + … + Y(T)
cum_k(S_T) = T κ_k, by additivity and independence
cum_k(S_T/√T) = T^{−k/2} cum_k(S_T) = O(T^{1−k/2}) → 0 for k > 2 as T → ∞
Normal cumulants of order > 2 are 0, and the normal is determined by its moments
Hence (S_T − Tμ)/√T tends in distribution to N(0, σ²)
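The cumulant scaling in the proof can be seen numerically (a NumPy sketch, not from the slides): for standardized sums of T i.i.d. exponentials, the third cumulant decays like 2/√T and the fourth like 6/T.

```python
import numpy as np

rng = np.random.default_rng(5)
reps = 100_000

def std_cumulants(S):
    # Third and fourth cumulants of the standardized variable
    s = (S - S.mean()) / S.std()
    k3 = (s ** 3).mean()        # skewness ~ cum_3
    k4 = (s ** 4).mean() - 3.0  # excess kurtosis ~ cum_4
    return k3, k4

# Exponential(1) has standardized cumulants k3 = 2, k4 = 6;
# for sums of T of them, k3 ~ 2/sqrt(T) and k4 ~ 6/T -> 0
for T in (1, 10, 100):
    S = rng.exponential(size=(reps, T)).sum(axis=1)
    k3, k4 = std_cumulants(S)
    print(T, k3, k4)
```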
Stationary series: cumulant functions
cum{Y(t+u_1), …, Y(t+u_{k−1}), Y(t)} = c_k(t+u_1, …, t+u_{k−1}, t) = c_k(u_1, …, u_{k−1}), k = 2, 3, 4, …
Cumulant mixing: ∑_u |c_k(u_1, …, u_{k−1})| < ∞, u = (u_1, …, u_{k−1})