Probability and Statistical Inference (9th Edition)
Chapter 5 (Part 1/2): Distributions of Functions of Random Variables
November 18, 2015
Outline
5.1 Functions of One Random Variable
5.2 Transformations of Two Random Variables
5.3 Several Random Variables
5.4 The Moment-Generating Function Technique
Functions of One Random Variable
- Let X be a continuous random variable with pdf f(x). If we consider a function of X, say Y = u(X), then Y must also be a random variable with its own distribution
- The cdf of Y is G(y) = P(Y ≤ y) = P(u(X) ≤ y)
- The pdf of Y is g(y) = G′(y) (where ′ denotes the derivative)
Functions of One Random Variable
- Change-of-variable technique
- Let X be a continuous random variable with pdf f(x) with support c1 < x < c2. We begin this discussion by taking Y = u(X) to be a continuous increasing function of X with inverse function X = v(Y). The support of X maps onto the support of Y, d1 = u(c1) < y < d2 = u(c2). Then the cdf of Y is
  G(y) = P(Y ≤ y) = P(u(X) ≤ y) = P(X ≤ v(y))
- Thus,
  G(y) = ∫ from c1 to v(y) of f(x) dx = F(v(y)), d1 < y < d2
Functions of One Random Variable
- The derivative, g(y) = G′(y), of such an expression is given by
  g(y) = f(v(y)) v′(y), d1 < y < d2
- Suppose now that the function Y = u(X) and its inverse X = v(Y) are continuous decreasing functions. Then
  G(y) = P(u(X) ≤ y) = P(X ≥ v(y)) = 1 - F(v(y))
- Thus,
  g(y) = -f(v(y)) v′(y) = f(v(y)) |v′(y)|, since v′(y) < 0
Functions of One Random Variable
- Thus, for both the increasing and decreasing cases, we can write the pdf of Y as
  g(y) = f(v(y)) |v′(y)|, d1 < y < d2
Functions of One Random Variable
- Example 1
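A worked illustration of the change-of-variable formula (an assumed example, chosen here for illustration rather than taken from the textbook): let X have pdf f(x) = e^(-x), 0 < x < ∞, and let Y = X², which is an increasing function on this support. The inverse is x = v(y) = √y, so v′(y) = 1/(2√y), and
  g(y) = f(v(y)) |v′(y)| = e^(-√y) / (2√y), 0 < y < ∞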
Functions of One Random Variable
- Theorem 1: Suppose that a random variable X has a continuous distribution with cdf F. Then the random variable Y = F(X) has a uniform distribution on (0, 1)
- Random variables from any given continuous distribution can thus be converted to random variables having a uniform distribution, and vice versa
Functions of One Random Variable
[Figure: pdf and cdf F of the N(0, 1) distribution, illustrating that if X ~ N(0, 1), then Y = F(X) ~ U(0, 1)]
Functions of One Random Variable
- Theorem 1 (converse statement): If U is a uniform random variable on (0, 1), then the random variable X = F⁻¹(U) has cdf F (where F is a continuous cdf and F⁻¹ is its inverse function)
- Proof: P(X ≤ x) = P(F⁻¹(U) ≤ x) = P(U ≤ F(x)) = F(x)
Functions of One Random Variable
- Theorem 1 (converse statement) can be used to generate random variables of any distribution
- To generate values of X that are distributed according to the cdf F (see the sketch below):
  1. Generate a random number u from U, a uniform random variable on (0, 1)
  2. Compute the value x such that F(x) = u
  3. Take x to be the random number distributed according to the cdf F
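A minimal Python sketch of this recipe (inverse transform sampling), using the exponential distribution as an assumed example: its cdf F(x) = 1 - e^(-λx) has the closed-form inverse F⁻¹(u) = -ln(1 - u)/λ.

    import math
    import random

    def sample_exponential(lam):
        # Step 1: draw u from U(0, 1).
        u = random.random()
        # Steps 2-3: solve F(x) = u, i.e. x = F^(-1)(u) = -ln(1 - u)/lam.
        return -math.log(1.0 - u) / lam

    # Sanity check: the sample mean should be close to 1/lam.
    samples = [sample_exponential(2.0) for _ in range(100000)]
    print(sum(samples) / len(samples))  # approximately 0.5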
Functions of One Random Variable
- Example 2 (the transformation Y = u(X) is not one-to-one): Let Y = X², where X is Cauchy; then, for y > 0,
  G(y) = P(X² ≤ y) = P(-√y ≤ X ≤ √y) = F(√y) - F(-√y),
  where F and f are the cdf and pdf of X, with
  f(x) = 1/(π(1 + x²)), -∞ < x < ∞
  Thus,
  g(y) = G′(y) = [f(√y) + f(-√y)] / (2√y) = 1/(π √y (1 + y)), 0 < y < ∞
- In this case of a two-to-one transformation, we need to sum two terms, each of which is similar to the one-to-one case
Functions of One Random Variable
- Consider the discrete case
- Let X be a discrete random variable with pmf f(x) = P(X = x). Let Y = u(X) be a one-to-one transformation with inverse X = v(Y). Then the pmf of Y is
  g(y) = P(Y = y) = P(u(X) = y) = P(X = v(y)) = f(v(y))
- Note that, in the discrete case, the derivative factor |v′(y)| is not needed
Functions of One Random Variable
- Example 3: Let X be a uniform random variable on {1, 2, …, n}. Then Y = X + a is a uniform random variable on {a+1, a+2, …, a+n}: with v(y) = y - a, we get g(y) = f(y - a) = 1/n for y in {a+1, a+2, …, a+n}
Transformations of Two Random Variables
- If X1 and X2 are two continuous random variables with joint pdf f(x1, x2), and if Y1 = u1(X1, X2), Y2 = u2(X1, X2) has the single-valued inverse X1 = v1(Y1, Y2), X2 = v2(Y1, Y2), then the joint pdf of Y1 and Y2 is
  g(y1, y2) = |J| f(v1(y1, y2), v2(y1, y2)),
  where | | denotes absolute value and J is the Jacobian, the determinant
  J = det [ ∂x1/∂y1  ∂x1/∂y2 ; ∂x2/∂y1  ∂x2/∂y2 ]
Transformations of Two Random Variables
- Example 1: Let X1 and X2 be independent random variables, each with pdf (taking, for illustration, the exponential pdf)
  f(x) = e^(-x), 0 < x < ∞
  Hence, their joint pdf is
  f(x1, x2) = e^(-x1 - x2), 0 < x1 < ∞, 0 < x2 < ∞
- Let Y1 = X1 - X2, Y2 = X1 + X2. Thus x1 = (y1 + y2)/2, x2 = (y2 - y1)/2, and the Jacobian is
  J = det [ 1/2  1/2 ; -1/2  1/2 ] = 1/2
Transformations of Two Random Variables
- Then the joint pdf of Y1 and Y2 is
  g(y1, y2) = |J| f((y1 + y2)/2, (y2 - y1)/2) = (1/2) e^(-y2), -y2 < y1 < y2, 0 < y2 < ∞
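Under the exponential pdf assumed above, integrating g(y1, y2) over y2 > |y1| gives the marginal pdf of Y1 as (1/2) e^(-|y1|), a Laplace density. A quick Monte Carlo sketch to check this numerically:

    import random

    # Draw Y1 = X1 - X2 for independent Exponential(1) variates.
    n = 200000
    y1 = [random.expovariate(1.0) - random.expovariate(1.0) for _ in range(n)]

    # Compare the empirical probability of |Y1| <= 1 with the Laplace value:
    # integral of (1/2) e^(-|y|) over [-1, 1] = 1 - e^(-1), about 0.632.
    frac = sum(abs(v) <= 1.0 for v in y1) / n
    print(frac)  # approximately 0.632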
Transformations of Two Random Variables
- Example 2 (Box-Muller transformation): Let X1 and X2 be i.i.d. U(0, 1). Let
  Z1 = √(-2 ln X1) cos(2π X2), Z2 = √(-2 ln X1) sin(2π X2)
- Thus,
  X1 = e^(-Q/2), X2 = (1/(2π)) arctan(Z2/Z1),
  where Q = Z1² + Z2², and the Jacobian is
  J = -(1/(2π)) e^(-Q/2), so |J| = (1/(2π)) e^(-Q/2)
Transformations of Two Random Variables
- Since the joint pdf of X1 and X2 is
  f(x1, x2) = 1, 0 < x1 < 1, 0 < x2 < 1,
  it follows that the joint pdf of Z1 and Z2 is
  g(z1, z2) = |J| · 1 = (1/(2π)) e^(-(z1² + z2²)/2) = [(1/√(2π)) e^(-z1²/2)] [(1/√(2π)) e^(-z2²/2)]
- This is the joint pdf of two i.i.d. N(0, 1) random variables
- Hence, we can generate two i.i.d. N(0, 1) random variables from two i.i.d. U(0, 1) random variables using this method
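A minimal Python sketch of the Box-Muller method, using only the standard library:

    import math
    import random

    def box_muller():
        # X1, X2 ~ i.i.d. U(0, 1); use 1 - random() so the log argument is never 0.
        x1 = 1.0 - random.random()
        x2 = random.random()
        r = math.sqrt(-2.0 * math.log(x1))      # radius term sqrt(-2 ln X1)
        z1 = r * math.cos(2.0 * math.pi * x2)
        z2 = r * math.sin(2.0 * math.pi * x2)
        return z1, z2                           # two i.i.d. N(0, 1) variates

    # Sanity check: sample mean ~ 0, sample variance ~ 1.
    zs = [z for _ in range(50000) for z in box_muller()]
    m = sum(zs) / len(zs)
    v = sum((z - m) ** 2 for z in zs) / len(zs)
    print(m, v)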
Random Samples
- Assume that we conduct an experiment n times independently. Let Xk be the random variable corresponding to the outcome of the k-th run of the experiment. Then X1, X2, …, Xn form a random sample of size n
Random Samples
- For example, if we toss a die n times and let Xk be the random variable corresponding to the outcome of the k-th toss, then X1, X2, …, Xn form a random sample of size n
Theorems about Independent Random Variables
- Let X1, X2, …, Xn be n independent discrete random variables with pmfs f1, f2, …, fn, and let h be a function of n variables. Then the expected value of the random variable Z = h(X1, X2, …, Xn) is equal to
  E[Z] = Σ over x1 … Σ over xn of h(x1, …, xn) f1(x1) ⋯ fn(xn)
Theorems about Independent Random Variables
- Likewise, if X1, X2, …, Xn are independent continuous random variables with pdfs f1, f2, …, fn, then
  E[Z] = ∫ … ∫ h(x1, …, xn) f1(x1) ⋯ fn(xn) dx1 ⋯ dxn
Theorems about Independent Random Variables
- Theorem: If X1, X2, …, Xn are independent random variables and, for i = 1, 2, …, n, E[hi(Xi)] exists, then
  E[h1(X1) h2(X2) ⋯ hn(Xn)] = E[h1(X1)] E[h2(X2)] ⋯ E[hn(Xn)]
Theorems about Independent Random Variables
- Proof for the discrete case:
  E[h1(X1) ⋯ hn(Xn)] = Σ over x1 … Σ over xn of h1(x1) ⋯ hn(xn) f1(x1) ⋯ fn(xn)
  = [Σ over x1 of h1(x1) f1(x1)] ⋯ [Σ over xn of hn(xn) fn(xn)]
  = E[h1(X1)] ⋯ E[hn(Xn)]
- The proof for the continuous case can be derived similarly
Theorems about Independent Random Variables
- Theorem: Assume that X1, X2, …, Xn are n independent random variables with respective means μ1, μ2, …, μn and variances σ1², σ2², …, σn². Then the mean and variance of the random variable
  Y = a1 X1 + a2 X2 + … + an Xn,
  where a1, a2, …, an are real constants, are
  μY = a1 μ1 + a2 μ2 + … + an μn and σY² = a1² σ1² + a2² σ2² + … + an² σn²
Theorems about Independent Random Variables
- Proof: By linearity of expectation,
  μY = E[Y] = Σ over i of ai E[Xi] = Σ over i of ai μi
- For the variance,
  σY² = E[(Y - μY)²] = E[(Σ over i of ai (Xi - μi))²] = Σ over i Σ over j of ai aj E[(Xi - μi)(Xj - μj)]
Theorems about Independent Random Variables
- Since Xi and Xj are independent for i ≠ j,
  E[(Xi - μi)(Xj - μj)] = E[Xi - μi] E[Xj - μj] = 0
- Therefore, only the i = j terms survive:
  σY² = Σ over i of ai² E[(Xi - μi)²] = Σ over i of ai² σi²
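A quick numerical sketch of this theorem, with toy parameters assumed for illustration:

    import random

    # Y = 2*X1 - 3*X2 with X1 ~ N(1, 4) and X2 ~ N(-2, 9), independent.
    # Theorem: E[Y] = 2*1 - 3*(-2) = 8, Var(Y) = 4*4 + 9*9 = 97.
    n = 200000
    ys = [2 * random.gauss(1, 2) - 3 * random.gauss(-2, 3) for _ in range(n)]
    m = sum(ys) / n
    v = sum((y - m) ** 2 for y in ys) / n
    print(m, v)  # approximately 8 and 97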
Moment-Generating Function Technique
- Let X be a random variable. The moment-generating function (mgf) of X is defined as
  MX(t) = E[e^(tX)]
- It is called the mgf because all of the moments of X can be obtained by successively differentiating MX(t)
Moment-Generating Function Technique
- For example,
  MX′(t) = E[X e^(tX)]
- Thus,
  MX′(0) = E[X]
- Similarly,
  MX″(t) = E[X² e^(tX)]
- Thus,
  MX″(0) = E[X²]
Moment-Generating Function Technique
- In general, the nth derivative of MX(t) evaluated at t = 0 equals E[Xⁿ], i.e.,
  MX⁽ⁿ⁾(0) = E[Xⁿ],
  where MX⁽ⁿ⁾ denotes the nth derivative of MX
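A short sketch of this technique in Python with sympy, using as an assumed example the Exponential(λ) mgf M(t) = λ/(λ - t), valid for t < λ:

    import sympy as sp

    t, lam = sp.symbols('t lam', positive=True)
    M = lam / (lam - t)  # mgf of the Exponential(lam) distribution, t < lam

    EX = sp.diff(M, t, 1).subs(t, 0)   # M'(0)  = E[X]   = 1/lam
    EX2 = sp.diff(M, t, 2).subs(t, 0)  # M''(0) = E[X^2] = 2/lam^2
    print(sp.simplify(EX), sp.simplify(EX2))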
Moment-Generating Function Technique
- The moment-generating function uniquely determines the distribution. That is, there is a one-to-one correspondence between the moment-generating function (mgf) and the distribution function (pmf/pdf) of a random variable
Moment-Generating Function Technique
- Example 1 (mgf of N(0, 1)):
  MX(t) = E[e^(tX)] = ∫ e^(tx) (1/√(2π)) e^(-x²/2) dx
        = e^(t²/2) ∫ (1/√(2π)) e^(-(x - t)²/2) dx = e^(t²/2),
  where the last equality follows from the fact that the expression in the integral is the pdf of a normal random variable with mean t and variance 1, which integrates to one
Moment-Generating Function Technique
- Exercise (mgf of N(m, s²)): show that MX(t) = e^(mt + s²t²/2)
Moment-Generating Function Technique
- Theorem: If X1, X2, …, Xn are independent random variables with respective mgfs Mi(t), i = 1, 2, …, n, then the mgf of
  Y = X1 + X2 + … + Xn
  is
  MY(t) = M1(t) M2(t) ⋯ Mn(t)
Moment-Generating Function Technique
- Proof:
  MY(t) = E[e^(t(X1 + X2 + … + Xn))] = E[e^(tX1) e^(tX2) ⋯ e^(tXn)]
        = E[e^(tX1)] E[e^(tX2)] ⋯ E[e^(tXn)]   (by independence)
        = M1(t) M2(t) ⋯ Mn(t)
Moment-Generating Function Technique
- Corollary: If X1, X2, …, Xn correspond to independent random samples from a distribution with mgf M(t), then the mgf of Y = X1 + X2 + … + Xn is
  MY(t) = [M(t)]ⁿ
Moment-Generating Function Technique
- The mgf of the sum of independent random variables is just the product of the individual mgfs
Moment-Generating Function Technique
- Example 2: Recall that if Z1, Z2, …, Zn are independent N(0, 1), then W = Z1² + Z2² + … + Zn² has a distribution that is chi-square with n degrees of freedom, denoted χ²(n)
- Let X1, X2, …, Xn be independent chi-square random variables with r1, r2, …, rn degrees of freedom, respectively. Show that Y = X1 + X2 + … + Xn is χ²(r1 + r2 + … + rn)
Moment-Generating Function Technique
- Use the moment-generating function technique: the mgf of a χ²(r) random variable is M(t) = (1 - 2t)^(-r/2), t < 1/2, so
  MY(t) = Π over i of (1 - 2t)^(-ri/2) = (1 - 2t)^(-(r1 + r2 + … + rn)/2),
  which is the mgf of a χ²(r1 + r2 + … + rn) random variable. Thus, Y is χ²(r1 + r2 + … + rn)
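A Monte Carlo sketch of this result, building chi-square variates from squared normals, with toy degrees of freedom r1 = 2 and r2 = 3 assumed for illustration:

    import random

    def chi_square(r):
        # Sum of r squared N(0, 1) variates ~ chi-square(r).
        return sum(random.gauss(0, 1) ** 2 for _ in range(r))

    # Y = X1 + X2 with X1 ~ chi2(2), X2 ~ chi2(3) should be chi2(5):
    # its mean is 5 and its variance is 2 * 5 = 10.
    n = 100000
    ys = [chi_square(2) + chi_square(3) for _ in range(n)]
    m = sum(ys) / n
    v = sum((y - m) ** 2 for y in ys) / n
    print(m, v)  # approximately 5 and 10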