Conditional Density Functions and Conditional Expected Values

§ Conditional probability density functions are useful for updating the information about an event based on knowledge about some other related event.
§ We shall analyze the situation where the related event happens to be a random variable that is dependent on the one of interest.
§ Recall that the distribution function of X given an event B is
$$F_X(x \mid B) = P(X \le x \mid B) = \frac{P(\{X \le x\} \cap B)}{P(B)}.$$

§ Suppose we let $B = \{\, y_1 < Y \le y_2 \,\}$.
§ Then
$$F_X(x \mid y_1 < Y \le y_2) = \frac{P(X \le x,\; y_1 < Y \le y_2)}{P(y_1 < Y \le y_2)} = \frac{F_{XY}(x, y_2) - F_{XY}(x, y_1)}{F_Y(y_2) - F_Y(y_1)}.$$
§ This can be rewritten as
$$F_X(x \mid y_1 < Y \le y_2) = \frac{\int_{-\infty}^{x} \int_{y_1}^{y_2} f_{XY}(u, v)\, dv\, du}{\int_{y_1}^{y_2} f_Y(v)\, dv}.$$

§ To determine the limiting case $F_X(x \mid Y = y)$, let $y_1 = y$ and $y_2 = y + \Delta y$.
§ This gives
$$F_X(x \mid y < Y \le y + \Delta y) = \frac{\int_{-\infty}^{x} \int_{y}^{y + \Delta y} f_{XY}(u, v)\, dv\, du}{\int_{y}^{y + \Delta y} f_Y(v)\, dv}$$
§ and hence in the limit $\Delta y \to 0$,
$$F_X(x \mid Y = y) = \frac{\int_{-\infty}^{x} f_{XY}(u, y)\, du}{f_Y(y)}.$$
§ Thus, differentiating with respect to x,
$$f_{X \mid Y}(x \mid Y = y) = \frac{f_{XY}(x, y)}{f_Y(y)},$$
§ and the subscript X | Y is used as a reminder of the conditional nature of the density.

§ We have $f_{X \mid Y}(x \mid y) \ge 0$
§ and
$$\int_{-\infty}^{\infty} f_{X \mid Y}(x \mid y)\, dx = \frac{\int_{-\infty}^{\infty} f_{XY}(x, y)\, dx}{f_Y(y)} = \frac{f_Y(y)}{f_Y(y)} = 1.$$
§ Thus the formula represents a valid p.d.f. We shall refer to it as the conditional p.d.f of the r.v X given Y = y. We may also write
$$f_{X \mid Y}(x \mid y) = f_{X \mid Y}(x \mid Y = y).$$
§ Thus we have
$$f_{X \mid Y}(x \mid y) = \frac{f_{XY}(x, y)}{f_Y(y)},$$
§ and similarly
$$f_{Y \mid X}(y \mid x) = \frac{f_{XY}(x, y)}{f_X(x)}.$$

§ If the r.vs X and Y are independent, then the previous formulas reduce to
$$f_{X \mid Y}(x \mid y) = f_X(x) \qquad \text{and} \qquad f_{Y \mid X}(y \mid x) = f_Y(y).$$
§ This makes sense, since if X and Y are independent r.vs, information about Y shouldn't be of any help in updating our knowledge about X.
§ In the case of discrete-type r.vs, the conditional p.d.f reduces to
$$P(X = x_i \mid Y = y_j) = \frac{P(X = x_i, Y = y_j)}{P(Y = y_j)}.$$
§ How to obtain conditional p.d.fs?
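One way to make these formulas concrete is to evaluate them numerically. The following Python sketch (an illustration, not part of the original slides) assumes a joint p.d.f $f_{XY}(x, y) = x + y$ on the unit square, computes the conditional p.d.f along the slice Y = 0.3, and checks that it integrates to one:

```python
import numpy as np

# Assumed joint p.d.f for illustration: f_XY(x, y) = x + y on the unit square.
x = np.linspace(0.0, 1.0, 1001)
dx = x[1] - x[0]
y0 = 0.3                         # condition on Y = y0

f_joint = x + y0                 # f_XY(x, y0) along the slice Y = y0
f_y0 = (f_joint * dx).sum()      # marginal f_Y(y0) = ∫ f_XY(x, y0) dx ≈ 0.8
f_cond = f_joint / f_y0          # conditional p.d.f f_X|Y(x | y0)

print((f_cond * dx).sum())       # 1.0: the conditional p.d.f is a valid p.d.f
```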

Example
§ Given the joint p.d.f $f_{XY}(x, y)$, determine $f_{X \mid Y}(x \mid y)$ and $f_{Y \mid X}(y \mid x)$.
Solution
§ $f_{X \mid Y}(x \mid y) = \dfrac{f_{XY}(x, y)}{f_Y(y)}$, where $f_Y(y) = \displaystyle\int_{-\infty}^{\infty} f_{XY}(x, y)\, dx$.
§ Similarly, $f_{Y \mid X}(y \mid x) = \dfrac{f_{XY}(x, y)}{f_X(x)}$,
§ and $f_X(x) = \displaystyle\int_{-\infty}^{\infty} f_{XY}(x, y)\, dy$.

Example - continued
§ Also,
$$f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dy \qquad \text{and} \qquad f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dx.$$
§ From
$$f_{XY}(x, y) = f_{X \mid Y}(x \mid y)\, f_Y(y) = f_{Y \mid X}(y \mid x)\, f_X(x),$$
§ or
$$f_{X \mid Y}(x \mid y) = \frac{f_{Y \mid X}(y \mid x)\, f_X(x)}{f_Y(y)}.$$
§ But, we also have
$$f_Y(y) = \int_{-\infty}^{\infty} f_{Y \mid X}(y \mid x)\, f_X(x)\, dx.$$

Example - continued
§ So,
$$f_{X \mid Y}(x \mid y) = \frac{f_{Y \mid X}(y \mid x)\, f_X(x)}{\int_{-\infty}^{\infty} f_{Y \mid X}(y \mid x)\, f_X(x)\, dx}.$$
§ This represents the p.d.f version of Bayes' theorem.
§ This is very significant in communication problems.
§ In these problems, observations can be used to update our knowledge about unknown parameters, as the sketch below illustrates.
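As a minimal numerical illustration of this update step, the sketch below applies the discrete form of Bayes' theorem to a hypothetical binary channel; the prior and channel probabilities are invented for the example:

```python
import numpy as np

# Hypothetical binary channel: X is the transmitted bit, Y the received bit.
p_x = np.array([0.6, 0.4])            # a-priori P(X = 0), P(X = 1)
p_y_given_x = np.array([[0.9, 0.1],   # row 0: P(Y = 0 | X = 0), P(Y = 1 | X = 0)
                        [0.2, 0.8]])  # row 1: P(Y = 0 | X = 1), P(Y = 1 | X = 1)

y_obs = 1                             # observed channel output
joint = p_x * p_y_given_x[:, y_obs]   # P(X = x, Y = y_obs) for x = 0, 1
p_x_given_y = joint / joint.sum()     # Bayes: a-posteriori P(X = x | Y = y_obs)
print(p_x_given_y)                    # the observation shifts belief toward X = 1
```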

Example
§ An unknown random phase $\theta$ is uniformly distributed in $(-\pi, \pi)$, and $r = \theta + n$, where the noise $n \sim N(0, \sigma^2)$.
§ Determine $f_{\theta \mid r}(\theta \mid r)$.
Solution
§ Initially almost nothing about the r.v $\theta$ is known.
§ We assume its a-priori p.d.f to be uniform in the interval $(-\pi, \pi)$.
§ In $r = \theta + n$ we can think of
- n as the noise contribution,
- r as the observation.
§ It is reasonable to assume that $\theta$ and n are independent.

Example - continued
§ In that case, since it is given that $\theta$ is a constant, $r = \theta + n$ behaves like n, so that
$$f_{r \mid \theta}(r \mid \theta) = f_n(r - \theta) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(r - \theta)^2 / 2\sigma^2}.$$
§ This gives the a-posteriori p.d.f of $\theta$ given r:
$$f_{\theta \mid r}(\theta \mid r) = \frac{f_{r \mid \theta}(r \mid \theta)\, f_{\theta}(\theta)}{f_r(r)} = \varphi(r)\, e^{-(r - \theta)^2 / 2\sigma^2}, \qquad -\pi < \theta \le \pi,$$
§ where $\varphi(r)$ is a normalizing constant.

Example - continued
§ The knowledge about the observation r is reflected in the a-posteriori p.d.f of $\theta$.
§ It is no longer flat like the a-priori p.d.f; it shows higher probabilities in the neighborhood of $\theta = r$.
[Figure: (a) a-priori p.d.f of $\theta$; (b) a-posteriori p.d.f of $\theta$.]
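The same computation can be carried out on a grid. The Python sketch below evaluates the a-posteriori p.d.f, assuming, as in the example, a uniform a-priori p.d.f on $(-\pi, \pi)$ and Gaussian noise; the values of $\sigma$ and r are illustrative only:

```python
import numpy as np

# A-posteriori p.d.f of the phase theta given the observation r = theta + n.
sigma, r = 0.5, 1.2                    # illustrative noise level and observation

theta = np.linspace(-np.pi, np.pi, 2001)
dtheta = theta[1] - theta[0]

prior = np.full_like(theta, 1 / (2 * np.pi))           # flat a-priori p.d.f
likelihood = np.exp(-(r - theta)**2 / (2 * sigma**2))  # f_n(r - theta), up to a constant
posterior = prior * likelihood
posterior /= (posterior * dtheta).sum()                # phi(r) absorbs all constants

print(theta[np.argmax(posterior)])     # the posterior peaks near theta = r
```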

Conditional Mean
§ Conditional means are defined using conditional p.d.fs: using a limiting argument as before,
$$\mu_{X \mid Y} = E[X \mid Y = y] = \int_{-\infty}^{\infty} x\, f_{X \mid Y}(x \mid y)\, dx.$$
§ This is the conditional mean of X given Y = y.
§ Note that $E[X \mid Y = y]$ will be a function of y. Also,
$$\mu_{Y \mid X} = E[Y \mid X = x] = \int_{-\infty}^{\infty} y\, f_{Y \mid X}(y \mid x)\, dy.$$

Conditional Variance
§ In a similar manner, the conditional variance of X given Y = y is given by
$$\operatorname{Var}(X \mid Y = y) = \sigma_{X \mid Y}^{2} = E\big[(X - \mu_{X \mid Y})^{2} \mid Y = y\big] = \int_{-\infty}^{\infty} (x - \mu_{X \mid Y})^{2}\, f_{X \mid Y}(x \mid y)\, dx.$$

Example
§ Let the joint p.d.f $f_{XY}(x, y)$ be as given.
§ Determine $E[X \mid Y = y]$ and $E[Y \mid X = x]$.
Solution
§ The joint p.d.f is nonzero in the shaded area [of the original figure] and zero elsewhere, so the conditional p.d.fs $f_{X \mid Y}(x \mid y) = f_{XY}(x, y)/f_Y(y)$ and $f_{Y \mid X}(y \mid x) = f_{XY}(x, y)/f_X(x)$ are supported on that region.

Example - continued
§ Hence the conditional means follow by direct integration of the conditional p.d.fs:
$$E[X \mid Y = y] = \int x\, f_{X \mid Y}(x \mid y)\, dx, \qquad E[Y \mid X = x] = \int y\, f_{Y \mid X}(y \mid x)\, dy.$$
§ As a generalization of the conditional mean formulas, for any function g(X),
$$E[g(X) \mid Y = y] = \int_{-\infty}^{\infty} g(x)\, f_{X \mid Y}(x \mid y)\, dx.$$

Example - continued
§ Also,
$$E[g(X)] = \int_{-\infty}^{\infty} E[g(X) \mid Y = y]\, f_Y(y)\, dy = E\big[ E[g(X) \mid Y] \big].$$
§ Here the inner expectation is with respect to X and the outer expectation is with respect to Y.

Example - continued
§ Letting g(X) = X in this equation, we get
$$E[X] = E\big[ E[X \mid Y] \big],$$
§ where the inner expectation on the right side is with respect to X and the outer one is with respect to Y.
§ Similarly, we have
$$E[Y] = E\big[ E[Y \mid X] \big].$$
§ We can also obtain higher moments of X and Y in the same manner; a simulation check follows below.
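The sketch below is a quick Monte Carlo check of $E[X] = E[E[X \mid Y]]$ for an assumed toy model in which $Y \sim \mathrm{Uniform}(0, 1)$ and, given Y = y, $X \sim N(y, 1)$, so that $E[X \mid Y = y] = y$:

```python
import numpy as np

# Monte Carlo check of the law of total expectation, E[X] = E[E[X | Y]].
rng = np.random.default_rng(0)
y = rng.uniform(0.0, 1.0, 1_000_000)     # Y ~ Uniform(0, 1)
x = rng.normal(loc=y, scale=1.0)         # X | Y = y is N(y, 1), so E[X | Y] = Y

print(x.mean())                          # direct estimate of E[X]
print(y.mean())                          # estimate of E[E[X | Y]]; both ≈ 0.5
```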

Estimation with Conditional Mean
§ The conditional mean turns out to be an important concept in estimation and prediction theory.
§ What is the best predicted value of Y given that X = x?
§ It turns out that
- if "best" is meant in the sense of minimizing the mean square error between Y and its estimate $\hat{Y}$, then
- the conditional mean of Y given X = x, i.e., $\hat{Y} = E[Y \mid X = x]$, is the best estimate for Y.
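The following simulation illustrates this claim under an assumed Gaussian model X = Y + n, with Y and n independent standard normal r.vs, for which the conditional mean is known in closed form: $E[Y \mid X = x] = x/2$. The competing estimator $\hat{Y} = X$ is an arbitrary alternative:

```python
import numpy as np

# Mean square error of the conditional-mean estimator vs. a naive estimator.
rng = np.random.default_rng(1)
y = rng.normal(size=1_000_000)           # Y ~ N(0, 1)
x = y + rng.normal(size=1_000_000)       # X = Y + n, n ~ N(0, 1) independent of Y

mse_cond_mean = ((y - x / 2)**2).mean()  # estimator E[Y | X] = X / 2
mse_naive = ((y - x)**2).mean()          # naive estimator Y_hat = X

print(mse_cond_mean, mse_naive)          # ≈ 0.5 vs. ≈ 1.0: conditional mean wins
```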

Example: Poisson Sum of Bernoulli Random Variables
§ Let $X_i$, $i = 1, 2, 3, \ldots$, represent independent, identically distributed Bernoulli random variables with
$$P(X_i = 1) = p, \qquad P(X_i = 0) = 1 - p = q,$$
§ and N a Poisson random variable with parameter $\lambda$ that is independent of all $X_i$.
§ Consider the random variables
$$Y = \sum_{i=1}^{N} X_i, \qquad Z = N - Y.$$
§ Show that Y and Z are independent Poisson random variables.

Solution
§ To determine the joint probability mass function of Y and Z, consider
$$P(Y = m, Z = n) = P(Y = m, N = m + n) = P(Y = m \mid N = m + n)\, P(N = m + n)$$
$$= \binom{m + n}{m} p^{m} q^{n}\; e^{-\lambda} \frac{\lambda^{m + n}}{(m + n)!} = e^{-p\lambda}\, \frac{(p\lambda)^{m}}{m!} \cdot e^{-q\lambda}\, \frac{(q\lambda)^{n}}{n!}.$$

Solution - continued
§ Thus
$$P(Y = m, Z = n) = P(Y = m)\, P(Z = n),$$
§ and Y and Z are independent Poisson random variables with parameters $p\lambda$ and $q\lambda$, respectively.
§ Thus
- if a bird lays eggs that follow a Poisson random variable with parameter $\lambda$,
- and if each egg survives with probability p,
§ then the number of chicks that survive also forms a Poisson random variable with parameter $p\lambda$ (see the simulation sketch below).