Chapter 5 Discrete Random Variables and Probability Distributions

Random Variables A random variable is a variable that takes on numerical values determined by the outcome of a random experiment.

Discrete Random Variables A random variable is discrete if it can take on no more than a countable number of values.

Discrete Random Variables (Examples)
1. The number of defective items in a sample of twenty items taken from a large shipment.
2. The number of customers arriving at a check-out counter in an hour.
3. The number of errors detected in a corporation's accounts.
4. The number of claims on a medical insurance policy in a particular year.

Continuous Random Variables A random variable is continuous if it can take any value in an interval.

Continuous Random Variables (Examples)
1. The income in a year for a family.
2. The amount of oil imported into the U.S. in a particular month.
3. The change in the price of a share of IBM common stock in a month.
4. The time that elapses between the installation of a new computer and its failure.
5. The percentage of impurity in a batch of chemicals.

Discrete Probability Distributions The probability distribution function, P(x), of a discrete random variable X expresses the probability that X takes the value x, as a function of x. That is, P(x) = P(X = x), for all values of x.

Discrete Probability Distributions Graph the probability distribution function for the roll of a single six-sided die. Each value x = 1, 2, …, 6 has probability P(x) = 1/6 (Figure 5.1).

Required Properties of Probability Distribution Functions of Discrete Random Variables Let X be a discrete random variable with probability distribution function, P(x). Then
i. P(x) ≥ 0 for any value of x
ii. The individual probabilities sum to 1; that is, Σx P(x) = 1, where the notation indicates summation over all possible values x.
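The two required properties can be checked numerically. A minimal sketch using the fair-die distribution from Figure 5.1 (the die example is the only assumption beyond the definitions):

```python
# Check the two required properties of a discrete probability
# distribution function for a fair six-sided die:
# (i) P(x) >= 0 for every x, and (ii) the probabilities sum to 1.
from fractions import Fraction

die_pdf = {x: Fraction(1, 6) for x in range(1, 7)}

assert all(p >= 0 for p in die_pdf.values())   # property (i)
assert sum(die_pdf.values()) == 1              # property (ii)
```

Using exact `Fraction` values avoids the floating-point rounding that would otherwise make the sum only approximately 1.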

Cumulative Probability Function The cumulative probability function, F(x₀), of a random variable X expresses the probability that X does not exceed the value x₀, as a function of x₀. That is, F(x₀) = P(X ≤ x₀), where the function is evaluated at all values x₀.

Derived Relationship Between Probability and Cumulative Probability Function Let X be a random variable with probability function P(x) and cumulative probability function F(x₀). Then it can be shown that F(x₀) = Σ{x ≤ x₀} P(x), where the notation implies that summation is over all possible values x that are less than or equal to x₀.
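This relationship translates directly into code: the cumulative function is the running sum of the probability function. A sketch, again assuming the fair-die distribution as the example:

```python
# Cumulative probability function F(x0) = sum of P(x) over all x <= x0,
# built from a discrete probability function (fair die assumed).
die_pdf = {x: 1 / 6 for x in range(1, 7)}

def F(x0, pdf):
    """Cumulative probability: P(X <= x0)."""
    return sum(p for x, p in pdf.items() if x <= x0)
```

For the die, F(3) = 3/6 = 0.5 and F(6) = 1, matching the derived properties on the next slide.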

Derived Properties of Cumulative Probability Functions for Discrete Random Variables Let X be a discrete random variable with a cumulative probability function, F(x₀). Then we can show that
i. 0 ≤ F(x₀) ≤ 1 for every number x₀
ii. If x₀ and x₁ are two numbers with x₀ < x₁, then F(x₀) ≤ F(x₁)

Expected Value The expected value, E(X), of a discrete random variable X is defined as E(X) = Σx x P(x), where the notation indicates that summation extends over all possible values x. The expected value of a random variable is called its mean and is denoted μX.
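The definition is a single weighted sum. A sketch computing E(X) for the fair die (the die is an illustrative assumption; the formula is the slide's):

```python
# Expected value of a discrete random variable: E(X) = sum of x * P(x)
# over all possible values x, shown for a fair six-sided die.
die_pdf = {x: 1 / 6 for x in range(1, 7)}

mean = sum(x * p for x, p in die_pdf.items())
# For a fair die this is (1 + 2 + 3 + 4 + 5 + 6) / 6 = 3.5
```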

Variance and Standard Deviation Let X be a discrete random variable. The expectation of the squared discrepancies about the mean, (X − μX)², is called the variance, denoted σ²X, and is given by σ²X = E[(X − μX)²] = Σx (x − μX)² P(x). The standard deviation, σX, is the positive square root of the variance.

Variance (Alternative Formula) The variance of a discrete random variable X can be expressed as σ²X = E(X²) − μ²X = Σx x² P(x) − μ²X.
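The definitional formula and the alternative formula give the same result; a quick numerical check on the fair die (an assumed example) makes the equivalence concrete:

```python
# Variance of a discrete random variable computed two equivalent ways:
# the definition E[(X - mu)^2] and the shortcut E(X^2) - mu^2.
die_pdf = {x: 1 / 6 for x in range(1, 7)}
mu = sum(x * p for x, p in die_pdf.items())

var_def = sum((x - mu) ** 2 * p for x, p in die_pdf.items())
var_alt = sum(x ** 2 * p for x, p in die_pdf.items()) - mu ** 2
# Both equal 35/12 (about 2.9167) for a fair die.
```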

Expected Value and Variance for a Discrete Random Variable Using Microsoft Excel (Figure 5.4): Expected Value = 1.95, Variance = 1.9475

Bernoulli Distribution A Bernoulli distribution arises from a random experiment which can give rise to just two possible outcomes. These outcomes are usually labeled as either "success" or "failure." If π denotes the probability of a success and the probability of a failure is (1 − π), the Bernoulli probability function is P(0) = (1 − π) and P(1) = π.

Mean and Variance of a Bernoulli Random Variable The mean is μX = E(X) = π, and the variance is σ²X = E[(X − μX)²] = π(1 − π).

Sequences of x Successes in n Trials The number of sequences with x successes in n independent trials is C(n, x) = n! / (x!(n − x)!), where n! = n × (n − 1) × (n − 2) × … × 1 and 0! = 1.
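The counting formula is easy to implement from the factorials, and Python's standard library exposes the same quantity as `math.comb`. A minimal sketch:

```python
# Number of sequences with x successes in n independent trials:
# C(n, x) = n! / (x! * (n - x)!). math.comb computes the same count.
from math import factorial, comb

def n_sequences(n, x):
    return factorial(n) // (factorial(x) * factorial(n - x))

# Example: 2 successes in 5 trials can occur in 10 distinct orders.
```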

Binomial Distribution Suppose that a random experiment can result in two possible mutually exclusive and collectively exhaustive outcomes, "success" and "failure," and that π is the probability of a success in a single trial. If n independent trials are carried out, the distribution of the resulting number of successes "x" is called the binomial distribution. Its probability distribution function for the binomial random variable X = x is
P(x successes in n independent trials) = P(x) = [n! / (x!(n − x)!)] π^x (1 − π)^(n−x), for x = 0, 1, 2, …, n

Mean and Variance of a Binomial Probability Distribution Let X be the number of successes in n independent trials, each with probability of success π. Then X follows a binomial distribution with mean μ = E(X) = nπ and variance σ² = nπ(1 − π).

Binomial Probabilities - An Example - (Example 5.7) An insurance broker, Shirley Ferguson, has five contracts, and she believes that for each contract, the probability of making a sale is 0.40. What is the probability that she makes at most one sale? P(at most one sale) = P(X ≤ 1) = P(X = 0) + P(X = 1) = 0.078 + 0.259 = 0.337
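Example 5.7 can be verified directly from the binomial probability function. A sketch (the `binomial_pmf` helper is an illustrative name, not from the text):

```python
# Verify Example 5.7: five contracts, probability of a sale pi = 0.40,
# P(at most one sale) = P(X = 0) + P(X = 1).
from math import comb

def binomial_pmf(x, n, pi):
    """P(x successes in n independent trials)."""
    return comb(n, x) * pi ** x * (1 - pi) ** (n - x)

p_at_most_one = binomial_pmf(0, 5, 0.40) + binomial_pmf(1, 5, 0.40)
# 0.07776 + 0.2592 = 0.33696, which rounds to the slide's 0.337.
```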

Binomial Probabilities, n = 100, π = 0.40 (Figure 5.10)

Hypergeometric Distribution Suppose that a random sample of n objects is chosen from a group of N objects, S of which are successes. The distribution of the number of successes X in the sample is called the hypergeometric distribution. Its probability function is P(x) = [C(S, x) × C(N − S, n − x)] / C(N, n), where x can take integer values ranging from the larger of 0 and [n − (N − S)] to the smaller of n and S.
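The probability function is a ratio of three binomial coefficients. A minimal sketch using `math.comb` (the N = 10, S = 4, n = 3 numbers are assumed for illustration):

```python
# Hypergeometric probability: sampling n objects without replacement
# from N objects, S of which are successes.
from math import comb

def hypergeom_pmf(x, n, N, S):
    return comb(S, x) * comb(N - S, n - x) / comb(N, n)

# Example: N = 10, S = 4, n = 3.
# P(X = 1) = C(4,1) * C(6,2) / C(10,3) = 4 * 15 / 120 = 0.5
p_one = hypergeom_pmf(1, 3, 10, 4)
```

Summing the function over the valid range of x (here 0 through 3) returns 1, as any probability function must.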

Poisson Probability Distribution Assume that an interval is divided into a very large number of subintervals so that the probability of the occurrence of an event in any subinterval is very small. The assumptions of a Poisson probability distribution are:
1) The probability of the occurrence of an event is constant for all subintervals.
2) There can be no more than one occurrence in each subinterval.
3) Occurrences are independent; that is, the numbers of occurrences in any non-overlapping intervals are independent of one another.

Poisson Probability Distribution The random variable X is said to follow the Poisson probability distribution if it has the probability function P(x) = (e^(−λ) λ^x) / x!, where
1. P(x) = the probability of x successes over a given period of time or space, given λ
2. λ = the expected number of successes per time or space unit; λ > 0
3. e = 2.71828 (the base for natural logarithms)

Poisson Probability Distribution • The mean and variance of the Poisson probability distribution are μX = E(X) = λ and σ²X = E[(X − μX)²] = λ.
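The fact that both the mean and the variance equal λ can be checked numerically by summing over enough values of x. A sketch with λ = 2 (an assumed example value):

```python
# Poisson probability function P(x) = exp(-lam) * lam**x / x!,
# with mean and variance both equal to lam.
from math import exp, factorial

def poisson_pmf(x, lam):
    return exp(-lam) * lam ** x / factorial(x)

lam = 2.0
# Truncating at x = 100 leaves a negligible tail for lam = 2.
mean = sum(x * poisson_pmf(x, lam) for x in range(100))
var = sum((x - lam) ** 2 * poisson_pmf(x, lam) for x in range(100))
# Both sums come out approximately equal to lam.
```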

Partial Poisson Probabilities for λ = 0.03 Obtained Using Microsoft Excel PHStat (Figure 5.14)

Poisson Approximation to the Binomial Distribution Let X be the number of successes resulting from n independent trials, each with a probability of success, π. The distribution of the number of successes X is binomial, with mean nπ. If the number of trials n is large and nπ is of only moderate size (preferably nπ ≤ 7), this distribution can be approximated by the Poisson distribution with λ = nπ. The probability function of the approximating distribution is then P(x) = (e^(−nπ) (nπ)^x) / x!, for x = 0, 1, 2, …
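The quality of the approximation is easy to see numerically. A sketch comparing an exact binomial probability with its Poisson approximation (n = 100 and π = 0.02 are assumed illustration values, chosen so that nπ = 2 is well under 7):

```python
# Compare an exact binomial probability with its Poisson approximation
# using lam = n * pi, for large n and small pi.
from math import comb, exp, factorial

n, pi = 100, 0.02
lam = n * pi                                         # lam = 2

binom_p1 = comb(n, 1) * pi * (1 - pi) ** (n - 1)     # exact P(X = 1)
poisson_p1 = exp(-lam) * lam ** 1 / factorial(1)     # approximation
# The two values agree to roughly four decimal places.
```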

Joint Probability Functions Let X and Y be a pair of discrete random variables.

Joint Probability Functions Let X and Y be a pair of discrete random variables. Their joint probability function expresses the probability that X takes the specific value x and simultaneously Y takes the value y, as a function of x and y. The notation used is P(x, y) so,

Joint Probability Functions Let X and Y be a pair of jointly distributed random variables. In this context the probability function of the random variable X is called its marginal probability function and is obtained by summing the joint probabilities over all possible values of y; that is, P(x) = Σy P(x, y). Similarly, the marginal probability function of the random variable Y is P(y) = Σx P(x, y).
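Marginals fall out of the joint function by summing over the other variable. A sketch with a small toy joint distribution (the four probabilities below are assumed purely for illustration):

```python
# Marginal probability functions obtained from a joint probability
# function P(x, y) by summing over the other variable.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,   # toy joint distribution (assumed)
    (1, 0): 0.30, (1, 1): 0.40,
}

def marginal_x(x, joint):
    return sum(p for (xi, y), p in joint.items() if xi == x)

def marginal_y(y, joint):
    return sum(p for (x, yi), p in joint.items() if yi == y)
```

Here marginal_x(0) = 0.10 + 0.20 = 0.30, and the marginals of X sum to 1 as required.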

Properties of Joint Probability Functions • Let X and Y be discrete random variables

Properties of Joint Probability Functions • Let X and Y be discrete random variables with joint probability function P(x, y). Then 1. P(x, y) 0 for any pair of values x and y 2. The sum of the joint probabilities P(x, y) over all possible values must be 1.

Conditional Probability Functions Let X and Y be a pair of jointly distributed discrete random variables. The conditional probability function of the random variable Y, given that the random variable X takes the value x, expresses the probability that Y takes the value y, as a function of y, when the value x is specified for X. This is denoted P(y|x), and so by the definition of conditional probability P(y|x) = P(x, y) / P(x). Similarly, the conditional probability function of X, given Y = y, is P(x|y) = P(x, y) / P(y).

Independence of Jointly Distributed Random Variables The jointly distributed random variables X and Y are said to be independent if and only if their joint probability function is the product of their marginal probability functions, that is, if and only if P(x, y) = P(x)P(y) for all possible pairs of values x and y. And k random variables are independent if and only if P(x₁, x₂, …, xₖ) = P(x₁)P(x₂)···P(xₖ).

Expected Value Function of Jointly Distributed Random Variables Let X and Y be a

Expected Value Function of Jointly Distributed Random Variables Let X and Y be a pair of discrete random variables with joint probability function P(x, y). The expectation of any function g(x, y) of these random variables is defined as:

Stock Returns, Marginal Probability, Mean, Variance (Example 5.16) [Table 5.6 gives the joint probabilities P(x, y) for the returns X and Y, each taking the values 0%, 5%, 10%, and 15%; the entries legible in the original are all 0.0625.]

Covariance Let X be a random variable with mean μX, and let Y be a random variable with mean μY. The expected value of (X − μX)(Y − μY) is called the covariance between X and Y, denoted Cov(X, Y). For discrete random variables Cov(X, Y) = E[(X − μX)(Y − μY)] = Σx Σy (x − μX)(y − μY) P(x, y). An equivalent expression is Cov(X, Y) = E(XY) − μXμY = Σx Σy xy P(x, y) − μXμY.
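Both covariance formulas can be evaluated from a joint probability function. A sketch with an assumed toy joint distribution in which X and Y tend to move together:

```python
# Covariance of discrete X and Y computed from the definition and from
# the equivalent shortcut E(XY) - mu_x * mu_y.
joint = {
    (0, 0): 0.40, (0, 1): 0.10,   # toy joint distribution (assumed)
    (1, 0): 0.10, (1, 1): 0.40,
}

mu_x = sum(x * p for (x, y), p in joint.items())
mu_y = sum(y * p for (x, y), p in joint.items())

cov_def = sum((x - mu_x) * (y - mu_y) * p for (x, y), p in joint.items())
cov_alt = sum(x * y * p for (x, y), p in joint.items()) - mu_x * mu_y
# Here mu_x = mu_y = 0.5 and the covariance is 0.40 - 0.25 = 0.15.
```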

Correlation Let X and Y be jointly distributed random variables. The correlation between X and Y is ρ = Corr(X, Y) = Cov(X, Y) / (σXσY).

Covariance and Statistical Independence If two random variables are statistically independent, the covariance between them is 0. However, the converse is not necessarily true.

Portfolio Analysis The random variable X is the price for stock A and the random variable Y is the price for stock B. The market value, W, for the portfolio is given by the linear function W = aX + bY, where a is the number of shares of stock A and b is the number of shares of stock B.

Portfolio Analysis The mean value for W is μW = E(W) = aμX + bμY. The variance for W is σ²W = a²σ²X + b²σ²Y + 2ab Cov(X, Y), or, using the correlation, σ²W = a²σ²X + b²σ²Y + 2ab Corr(X, Y)σXσY.
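The two forms of the portfolio variance are linked by Cov(X, Y) = Corr(X, Y)σXσY. A sketch with assumed illustrative numbers for the share counts, variances, and correlation:

```python
# Portfolio variance for W = aX + bY:
# Var(W) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y),
# where Cov(X, Y) = Corr(X, Y) * sd_x * sd_y.
a, b = 10, 20                  # shares of stocks A and B (assumed)
var_x, var_y = 4.0, 9.0        # Var(X), Var(Y) (assumed)
corr = 0.5                     # Corr(X, Y) (assumed)

cov_xy = corr * var_x ** 0.5 * var_y ** 0.5          # 0.5 * 2 * 3 = 3
var_w = a ** 2 * var_x + b ** 2 * var_y + 2 * a * b * cov_xy
# 100 * 4 + 400 * 9 + 2 * 200 * 3 = 400 + 3600 + 1200 = 5200
```

A positive correlation increases the portfolio variance relative to holding uncorrelated stocks, which is the diversification argument this slide supports.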