Statistical Inference

Instructor: Shengyu Zhang

Statistical inference
- Statistical inference is the process of extracting information about an unknown variable or an unknown model from available data.
- Two main approaches:
  - Bayesian statistical inference
  - Classical statistical inference

Statistical inference
- Main categories of inference problems:
  - parameter estimation
  - hypothesis testing
  - significance testing

Statistical inference
- Most important methodologies:
  - maximum a posteriori (MAP) probability rule
  - least mean squares estimation
  - maximum likelihood
  - regression
  - likelihood ratio tests

Bayesian versus Classical Statistics
- Two prominent schools of thought:
  - Bayesian
  - Classical/frequentist
- Difference: what is the nature of the unknown models or variables? (contrast sketched below)
  - Bayesian: they are treated as random variables with known distributions.
  - Classical/frequentist: they are treated as deterministic but unknown quantities.
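A minimal illustration of the contrast, using an unknown mean observed through additive noise (this particular setting is chosen here for concreteness, not taken from the slides):

  Observation model:  X = \Theta + W,  W \sim N(0, \sigma^2).

- Bayesian view: \Theta is a random variable with a prior p_\Theta, and inference reports the posterior p_{\Theta\mid X}(\theta \mid x) \propto p_\Theta(\theta)\, f_{X\mid\Theta}(x \mid \theta).
- Classical/frequentist view: \theta is a fixed unknown constant, and inference uses estimators such as the maximum likelihood estimate \hat{\theta}_{ML}(x) = \arg\max_\theta f_X(x;\theta), judged by their behavior over repeated sampling.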

Bayesian

Classical/frequentist

Model versus Variable Inference
- Model inference: the object of study is a real phenomenon or process, for which we wish to construct or validate a model on the basis of available data.
  - e.g., do planets follow elliptical trajectories?
- Such a model can then be used to make predictions about the future, or to infer some hidden underlying causes.

Model versus Variable Inference
- Variable inference: we wish to estimate the value of one or more unknown variables by using some related, possibly noisy information.
  - e.g., what is my current position, given a few GPS readings? (sketched below)
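A minimal sketch of the GPS case (the Gaussian noise model and flat prior are assumptions made here for illustration): with readings X_i = s + W_i, i = 1, \dots, n, where the W_i are independent N(0, \sigma^2) noise terms and the prior on the position s is flat, the posterior of s is proportional to the likelihood and peaks at the sample mean,

  \hat{s} = \frac{1}{n} \sum_{i=1}^{n} X_i,

so averaging a few noisy readings already gives the MAP (and maximum likelihood) estimate of the position under these assumptions.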

Statistical Inference Problems

Content
- Bayesian inference, the posterior distribution
- Point estimation, hypothesis testing, MAP
- Bayesian least mean squares estimation
- Bayesian linear least mean squares estimation

Bayesian inference

Summary of Bayesian Inference

Bayes’ rule: summary
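For reference, the posterior of the unknown Θ given an observation X = x follows from Bayes’ rule; in the standard discrete and continuous forms,

  p_{\Theta\mid X}(\theta \mid x) = \frac{p_\Theta(\theta)\, p_{X\mid\Theta}(x \mid \theta)}{\sum_{\theta'} p_\Theta(\theta')\, p_{X\mid\Theta}(x \mid \theta')},
  \qquad
  f_{\Theta\mid X}(\theta \mid x) = \frac{f_\Theta(\theta)\, f_{X\mid\Theta}(x \mid \theta)}{\int f_\Theta(\theta')\, f_{X\mid\Theta}(x \mid \theta')\, d\theta'}.

The denominator does not depend on θ, so the posterior is proportional to prior times likelihood.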

Example: meeting


Example: inference of the common mean of normal random variables


Numerator

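One standard version of this example, stated here as an assumed setup rather than a verbatim transcription of the slides: the observations are X_i = \Theta + W_i, i = 1, \dots, n, with independent noise W_i \sim N(0, \sigma_i^2), and the prior is \Theta \sim N(x_0, \sigma_0^2). The numerator of Bayes’ rule, prior times likelihood, is the exponential of a quadratic in θ, so the posterior is again normal, with

  \mathbb{E}[\Theta \mid X = x] = \frac{\sum_{i=0}^{n} x_i / \sigma_i^2}{\sum_{i=0}^{n} 1 / \sigma_i^2},
  \qquad
  \operatorname{var}(\Theta \mid X = x) = \frac{1}{\sum_{i=0}^{n} 1 / \sigma_i^2},

a precision-weighted average that treats the prior mean x_0 as one more "observation" with variance \sigma_0^2.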

Content
- Bayesian inference, the posterior distribution
- Point estimation, hypothesis testing, MAP
- Bayesian least mean squares estimation
- Bayesian linear least mean squares estimation

MAP

- This is called the Maximum a Posteriori probability (MAP) rule.
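Concretely, having observed X = x, the MAP rule selects a value of θ that maximizes the posterior,

  \hat{\theta}_{\text{MAP}}(x) = \arg\max_{\theta} \, p_{\Theta\mid X}(\theta \mid x),

with the posterior PMF replaced by the posterior density f_{\Theta\mid X}(\theta \mid x) when Θ is continuous.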


Computational shortcut
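The shortcut assumed here is the standard one: the denominator of Bayes’ rule does not depend on θ, so maximizing the posterior is equivalent to maximizing the numerator,

  \arg\max_{\theta} \, p_{\Theta\mid X}(\theta \mid x) = \arg\max_{\theta} \, p_\Theta(\theta)\, p_{X\mid\Theta}(x \mid \theta),

which avoids computing the normalizing constant entirely.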

Example

Point Estimation

Two popular estimators
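The two estimators paired throughout the rest of the deck are the MAP estimator and the conditional expectation (least mean squares, LMS) estimator; as a reference,

  \hat{\Theta}_{\text{MAP}} = \arg\max_{\theta} \, p_{\Theta\mid X}(\theta \mid X),
  \qquad
  \hat{\Theta}_{\text{LMS}} = \mathbb{E}[\Theta \mid X].

The first reports the most probable value under the posterior; the second minimizes the mean squared error, as discussed in the LMS section below.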

Example: Romeo and Juliet meeting


MAP vs. conditional expectation

Hypothesis testing


Correct probability
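In the Bayesian hypothesis-testing setting, Θ takes one of finitely many values θ_1, ..., θ_m, one per hypothesis. Stated here as the standard facts: the MAP rule selects the hypothesis with the largest posterior probability,

  choose H_i with  i = \arg\max_j \, P(\Theta = \theta_j \mid X = x) = \arg\max_j \, P(\Theta = \theta_j)\, p_{X\mid\Theta}(x \mid \theta_j),

  P(\text{correct decision} \mid X = x) = \max_j \, P(\Theta = \theta_j \mid X = x).

Because the MAP rule attains this maximum for every x, it maximizes the overall probability of a correct decision. In the binary case it reduces to a likelihood ratio test: choose H_1 when p_{X\mid\Theta}(x \mid \theta_1) / p_{X\mid\Theta}(x \mid \theta_2) exceeds P(\Theta = \theta_2) / P(\Theta = \theta_1).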

Example: binary hypothesis testing


Content
- Bayesian inference, the posterior distribution
- Point estimation, hypothesis testing, MAP
- Bayesian least mean squares estimation
- Bayesian linear least mean squares estimation

Estimation without observation

Proof
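A sketch of the standard claim and argument (assuming "estimation without observation" refers, as usual, to choosing a constant c to minimize the mean squared error E[(Θ − c)²]): adding and subtracting E[Θ] and expanding, the cross term vanishes and

  \mathbb{E}\big[(\Theta - c)^2\big] = \operatorname{var}(\Theta) + \big(\mathbb{E}[\Theta] - c\big)^2,

which is minimized by c = \mathbb{E}[\Theta], with optimal mean squared error \operatorname{var}(\Theta).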

Estimation with observation

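With an observation, the same argument applies conditionally: for each value x, the conditional mean squared error \mathbb{E}[(\Theta - c)^2 \mid X = x] is minimized by c = \mathbb{E}[\Theta \mid X = x]. Consequently, over all estimators g(X),

  \hat{\Theta}_{\text{LMS}} = \mathbb{E}[\Theta \mid X]
  \quad \text{minimizes} \quad
  \mathbb{E}\big[(\Theta - g(X))^2\big],

and the resulting mean squared error is \mathbb{E}\big[\operatorname{var}(\Theta \mid X)\big].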

Example


Example: meeting


- The MAP estimator takes smaller values than the LMS estimator.
- The LMS estimator has uniformly smaller mean squared error.

Properties of estimation error

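The properties usually collected under this heading, stated for the LMS estimator \hat{\Theta} = \mathbb{E}[\Theta \mid X] and the estimation error \tilde{\Theta} = \hat{\Theta} - \Theta (the sign convention is an assumption; reversing it changes nothing below):

  \mathbb{E}[\tilde{\Theta} \mid X] = 0, \qquad
  \mathbb{E}[\tilde{\Theta}] = 0, \qquad
  \operatorname{cov}(\tilde{\Theta}, \hat{\Theta}) = 0, \qquad
  \operatorname{var}(\Theta) = \operatorname{var}(\hat{\Theta}) + \operatorname{var}(\tilde{\Theta}).

The first identity follows because \mathbb{E}[\tilde{\Theta} \mid X] = \hat{\Theta} - \mathbb{E}[\Theta \mid X] = 0; the others follow from it by iterated expectations and by writing \Theta = \hat{\Theta} - \tilde{\Theta}.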

Content
- Bayesian inference, the posterior distribution
- Point estimation, hypothesis testing, MAP
- Bayesian least mean squares estimation
- Bayesian linear least mean squares estimation

- The LMS estimator is sometimes hard to compute, and we need alternatives.
- We derive an estimator that minimizes the mean squared error within a restricted class of estimators: linear functions of the observations.
- This estimator may result in higher mean squared error.
- But it has a significant computational advantage (see the sketch below).
  - It requires simple calculations, involving only means, variances, and covariances of the parameters and observations.
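A sketch of the resulting estimator in the single-observation case (the standard formula, using only the moments listed above): among linear estimators \hat{\Theta} = aX + b, the mean squared error \mathbb{E}[(\Theta - aX - b)^2] is minimized by

  \hat{\Theta}_{\text{LLMS}} = \mathbb{E}[\Theta] + \frac{\operatorname{cov}(\Theta, X)}{\operatorname{var}(X)} \big(X - \mathbb{E}[X]\big)
                             = \mathbb{E}[\Theta] + \rho \, \frac{\sigma_\Theta}{\sigma_X} \big(X - \mathbb{E}[X]\big),

with resulting mean squared error (1 - \rho^2)\, \sigma_\Theta^2, where ρ is the correlation coefficient of Θ and X.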


Example
