# Review of Probability, Random Process, Random Field for Image Processing

- Slides: 20

Review of Probability, Random Process, Random Field for Image Processing © 2002-2003 by Yu Hen Hu, ECE 533 Digital Image Processing

## Probability Models

- Experiment: throw a die, toss a coin, …
  - Each experiment has an outcome.
  - Experiments can be repeated.
- Sample space (Ω): the set of all possible outcomes of an experiment.
- Event: a subset of outcomes in Ω that has a particular meaning.
- Probability of an event A: P(A) = |A|/|Ω|, where |A| is the cardinality of A, i.e., the number of elements in the set A (assuming equally likely outcomes).
- Example (card draw): a single card is drawn from a well-shuffled deck of playing cards. Find P(drawing an Ace).
  - Ω = {1, 2, …, 52}, |Ω| = 52.
  - Event A = drawing an Ace. If the four Ace cards are labeled 1, 2, 3, 4, then A = {1, 2, 3, 4} and |A| = 4.
  - Thus P(A) = 4/52 = 1/13.
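As a quick sanity check, the card-draw example can be computed by direct enumeration (a minimal sketch; the card labels are arbitrary, as on the slide):

```python
from fractions import Fraction

# Sample space: 52 equally likely cards labeled 1..52; by the slide's
# convention the four Aces carry the labels 1-4 (the labels are arbitrary).
omega = set(range(1, 53))
A = {1, 2, 3, 4}                      # event "drawing an Ace"

# P(A) = |A| / |Omega| for equally likely outcomes
p_ace = Fraction(len(A), len(omega))
print(p_ace)  # 1/13
```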

## Axioms of a Probability Model

- Each outcome ω_i of an experiment can be assigned a probability measure P(ω_i) such that 0 ≤ P(ω_i) ≤ 1.
- For fair experiments where each outcome is equally likely to occur, P(ω_i) = 1/|Ω|.
- In general, the probability of an event A, which is a set of outcomes, is evaluated as P(A) = Σ_{ω_i ∈ A} P(ω_i).
- Given a set A, its corresponding probability measure P(A) has the following properties:
  1. P(∅) = 0 (the empty set is an impossible event).
  2. P(Ω) = 1 (the probability of the entire sample space is unity).
  3. P(A) ≥ 0 for every event A.
  4. If A_m ∩ A_n = ∅ for m ≠ n, then P(∪_n A_n) = Σ_n P(A_n).

## Independence

- Two events A and B are statistically independent if P(A ∩ B) = P(A)P(B).
- Independence of N events: given N events {A_n; 1 ≤ n ≤ N}, these N events are mutually independent iff P(∩_{n∈J} A_n) = Π_{n∈J} P(A_n), where J ⊆ {1, 2, …, N} is any subset of the indices.
- Question: given a fair coin and a well-shuffled deck of cards, what is the probability of tossing the coin and observing a Head AND drawing the Jack of hearts?
- Answer: P(Head) = 1/2 and P(Jack of hearts) = 1/52. The events of tossing a coin and drawing a card are independent, hence P(Head AND Jack of hearts) = P(Head) P(Jack of hearts) = 1/104.
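The coin-and-card question can be verified by enumerating the joint sample space of the two independent experiments (a sketch; which card label denotes the Jack of hearts is an arbitrary choice):

```python
from fractions import Fraction
from itertools import product

coin = ["H", "T"]
deck = range(1, 53)                 # card label 1 stands in for the Jack of hearts
omega = list(product(coin, deck))   # joint sample space, 104 equally likely outcomes

event = [(c, k) for c, k in omega if c == "H" and k == 1]
p_joint = Fraction(len(event), len(omega))
print(p_joint)  # 1/104
```

The product P(Head)·P(Jack of hearts) = (1/2)(1/52) gives the same value, as independence requires.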

## Conditional Probability

- Let A and B be two events in the sample space Ω. Given that B has occurred, the conditional probability that A will also occur is defined as P(A|B) = P(A ∩ B)/P(B), assuming P(B) ≠ 0.
- Theorem: if A and B are independent events, then P(A|B) = P(A)P(B)/P(B) = P(A).
- Example: a fair die is thrown twice. Given that the sum of the two outcomes is 9, what is the probability that the outcome of the first throw is 4?
- Answer: let the outcome of the first throw be m and the second throw be n. Then
  - B = {(m, n); m + n = 9, 1 ≤ m, n ≤ 6} = {(3, 6), (4, 5), (5, 4), (6, 3)}
  - A ∩ B = {(m, n); m = 4, n = 5} = {(4, 5)}
  - P(A|B) = P(A ∩ B)/P(B) = (1/36)/(4/36) = 1/4. Note that P(A) = 1/6.
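The two-throw example can be confirmed by listing the 36 equally likely outcomes (a minimal sketch of the computation on the slide):

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))   # two throws of a fair die
B = [(m, n) for m, n in omega if m + n == 9]   # condition: the sum is 9
A_and_B = [(m, n) for m, n in B if m == 4]     # first throw is 4, within B

# P(A|B) = P(A ∩ B)/P(B) = |A ∩ B|/|B| for equally likely outcomes
p = Fraction(len(A_and_B), len(B))
print(p)  # 1/4
```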

## Law of Total Probability & Bayes' Rule

- Law of total probability: let {B_n} be a set of events that partitions the sample space Ω. Then for any event A, P(A) = Σ_n P(A|B_n) P(B_n).
- Bayes' rule: thus, P(B_k|A) = P(A|B_k) P(B_k) / P(A) = P(A|B_k) P(B_k) / Σ_n P(A|B_n) P(B_n).
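A small numerical illustration of both rules (the three-event partition and all the probabilities below are made up for this sketch, not taken from the slide):

```python
from fractions import Fraction

# Hypothetical partition {B1, B2, B3} of the sample space, with
# illustrative priors P(Bn) and likelihoods P(A|Bn).
priors = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)]
likelihoods = [Fraction(1, 10), Fraction(1, 5), Fraction(1, 2)]

# Law of total probability: P(A) = sum_n P(A|Bn) P(Bn)
p_A = sum(l * p for l, p in zip(likelihoods, priors))

# Bayes' rule: P(Bk|A) = P(A|Bk) P(Bk) / P(A)
posteriors = [l * p / p_A for l, p in zip(likelihoods, priors)]
print(p_A, posteriors)   # the posteriors sum to 1
```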

## Random Variable

- A random variable X(ω) is a real-valued function defined for points ω in a sample space Ω.
- Example: if Ω is the whole class of students and ω is an individual student, we may define X(ω) as the height of that student (in feet).
- Question: what is the probability that a student's height is between 5 feet and 6 feet? Define B = [5, 6]. Our goal is to find P({ω : 5 ≤ X(ω) ≤ 6}) = P({ω : X(ω) ∈ B}) = P({X ∈ B}).
- In general, we are interested in the probability P({X ∈ B}), written P(X ∈ B) for convenience. If B = {x₀} is a singleton set, we simply write P(X = x₀).
- Example 2.1: P(a < X ≤ b) = P(X ≤ b) − P(X ≤ a).
- Example 2.2: P(X = 0 or X = 1) = P(X = 0) + P(X = 1).

## Probability Mass Functions (PMF) and Expectations

- The probability mass function (PMF) of a discrete random variable X is defined by p_X(x_i) = P(X = x_i). Hence Σ_i p_X(x_i) = 1.
- Joint PMF of X and Y: p_XY(x_i, y_j) = P(X = x_i, Y = y_j).
- Marginal PMF: p_X(x_i) = Σ_j p_XY(x_i, y_j).
- Expectation (mean, average): E[X] = Σ_i x_i p_X(x_i).
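A minimal sketch of these definitions, using a fair six-sided die as the discrete random variable:

```python
from fractions import Fraction

# PMF of X = outcome of a fair six-sided die
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
assert sum(pmf.values()) == 1       # a PMF must sum to one

# E[X] = sum_i x_i p_X(x_i)
mean = sum(x * p for x, p in pmf.items())
print(mean)  # 7/2
```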

## Moments and Standard Deviation

- n-th moment: E[X^n], defined over a real-valued random variable X.
- Variance and standard deviation: let m = E[X]. Then var(X) = E[(X − m)²] = E[X² − 2Xm + m²] = E[X²] − 2m E[X] + m² = E[X²] − (E[X])². The standard deviation is √var(X).
- Example: find E[X²] and var(X) of a Bernoulli r.v. X. E[X²] = 0²(1 − p) + 1²p = p. Since E[X] = p, var(X) = E[X²] − (E[X])² = p − p² = p(1 − p).
- Example: let X ~ Poisson(λ). Since E[X(X − 1)] = λ², we have E[X²] = λ² + λ. Thus var(X) = (λ² + λ) − λ² = λ.
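Both examples can be checked numerically; the values p = 3/10 and λ = 2.5 below are arbitrary, and the Poisson PMF is truncated at 200 terms (the remaining tail is negligible):

```python
from fractions import Fraction
from math import exp, isclose

# Bernoulli(p): E[X^2] = 0^2 (1-p) + 1^2 p = p, so var(X) = p - p^2 = p(1-p)
p = Fraction(3, 10)
var_bernoulli = p - p**2
assert var_bernoulli == p * (1 - p)

# Poisson(lam): check var(X) = lam by summing the (truncated) PMF
lam = 2.5
term = exp(-lam)                     # P(X = 0)
mean = second_moment = 0.0
for n in range(200):
    mean += n * term
    second_moment += n * n * term
    term *= lam / (n + 1)            # P(X = n+1) = P(X = n) * lam/(n+1)
var_poisson = second_moment - mean**2
print(var_poisson)                   # ~2.5
```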

## Conditional Probability of Random Variables

- The conditional PMF is defined as p_Y|X(y_j | x_i) = P(Y = y_j | X = x_i).
- Example: let X = the message to be sent (an integer). For X = i, light intensity λ_i is directed at a photodetector, and Y = the number of photoelectrons generated at the detector, with Y given X = i distributed as Poisson(λ_i).
- Solution: in terms of the PMF, P(Y = n | X = i) = λ_i^n e^(−λ_i) / n!, for n = 0, 1, 2, ….
- Thus P(Y < 2 | X = i) = P(Y = 0 | X = i) + P(Y = 1 | X = i) = (1 + λ_i) e^(−λ_i).
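The closed form P(Y < 2 | X = i) = (1 + λ_i)e^(−λ_i) can be verified against a direct PMF evaluation (λ = 3 is an arbitrary illustrative intensity):

```python
from math import exp, isclose

lam = 3.0                            # illustrative intensity lambda_i for message X = i

def poisson_pmf(n, lam):
    """P(Y = n | X = i) = lam^n e^{-lam} / n!, computed iteratively."""
    p = exp(-lam)
    for k in range(1, n + 1):
        p *= lam / k
    return p

p_lt2 = poisson_pmf(0, lam) + poisson_pmf(1, lam)
print(p_lt2)                         # equals (1 + lam) * exp(-lam)
```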

## Definitions of Continuous R.V.s

- Definition (continuous r.v.): let X(ω) be a random variable defined on a sample space Ω. X is a continuous random variable if P(X ∈ B) = ∫_B f_X(x) dx for some nonnegative function f_X and every set B.
- Definition (probability density function): f(x) is a probability density function (pdf) if f(x) ≥ 0 for all x and ∫ f(x) dx = 1 over the whole real line.

## Cumulative Distribution Function

- Definition: the cumulative distribution function (cdf) of a random variable X is defined by F_X(x) = P(X ≤ x).
- Properties of CDFs:
  - (a) 0 ≤ F(x) ≤ 1.
  - (b) P(a < X ≤ b) = F(b) − F(a).
  - (c) a < b implies F(a) ≤ F(b) (F is non-decreasing).
  - (d) F(x) → 0 as x → −∞.
  - (e) F(x) → 1 as x → +∞.
  - (f) F(x) is right continuous, i.e., F(x₀⁺) = F(x₀).
  - (g) If X is a continuous random variable, F(x) = ∫ from −∞ to x of f(t) dt, and f(x) = dF(x)/dx.
  - (h) P(X = x₀) = F(x₀) − F(x₀⁻).
- Note that if F(x) is continuous at x₀, then F(x₀⁻) = F(x₀), so from (h), P(X = x₀) = 0.
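Property (b) in action, using the CDF of an exponential random variable with rate 1 (an arbitrary continuous example):

```python
from math import exp, isclose

def F(x):
    """CDF of an exponential r.v. with rate 1: F(x) = 1 - e^{-x} for x >= 0."""
    return 1.0 - exp(-x) if x >= 0 else 0.0

a, b = 0.5, 2.0
p_interval = F(b) - F(a)             # P(a < X <= b) = F(b) - F(a)
print(p_interval)

# Since this F is continuous everywhere, P(X = x0) = F(x0) - F(x0-) = 0 for all x0.
```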

## Functions of Random Variables

- Let X be a random variable and g(·) a real-valued function; Y = g(X) is a new random variable. We want to find P(Y ∈ C) in terms of F_X(x). For this, we must find the set B = {x : g(x) ∈ C}, so that P(Y ∈ C) = P(X ∈ B).
- To find F_Y(y), take C = (−∞, y], or equivalently F_Y(y) = P(g(X) ≤ y).
- Example: X is an input voltage, a random variable, and Y = g(X) = aX + b, where a ≠ 0 is the gain and b is an offset voltage.
- Solution: g(x) ≤ y iff x ≤ (y − b)/a for a > 0, and iff x ≥ (y − b)/a for a < 0.
  - a > 0: F_Y(y) = F_X((y − b)/a), so f_Y(y) = dF_Y/dy = (1/a) f_X((y − b)/a).
  - a < 0: F_Y(y) = 1 − F_X((y − b)/a), so f_Y(y) = dF_Y/dy = (−1/a) f_X((y − b)/a).
- In summary, f_Y(y) = (1/|a|) f_X((y − b)/a).
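The formula f_Y(y) = (1/|a|) f_X((y − b)/a) can be spot-checked for a standard normal X, where Y = aX + b is known to be N(b, a²); the gain and offset below are arbitrary:

```python
from math import exp, pi, sqrt, isclose

def phi(x):
    """Standard normal pdf f_X(x)."""
    return exp(-x * x / 2) / sqrt(2 * pi)

a, b = -2.0, 1.0                     # arbitrary gain (a != 0) and offset

def f_Y(y):
    """pdf of Y = aX + b via f_Y(y) = (1/|a|) f_X((y - b)/a)."""
    return phi((y - b) / a) / abs(a)

def normal_pdf(y, mean, var):
    return exp(-((y - mean) ** 2) / (2 * var)) / sqrt(2 * pi * var)

for y in (-3.0, 0.0, 2.5):
    assert isclose(f_Y(y), normal_pdf(y, b, a * a))
print("linear-transform pdf matches N(b, a^2)")
```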

## Random Processes and Random Fields

- Random process: a family of random variables X_t(ω). For each fixed outcome ω, X_t(ω) is a function of t (time); for each fixed index t, X_t(ω) is a random variable defined on Ω.
- Example a: a jukebox has 6 songs. You roll a die and, based on its outcome, pick a song.
- Example b: let t ∈ {0, 1, 2, …}. At each t, toss a coin; X_t(ω) = 0 if the outcome is a tail, and 1 if it is a head.
- Random field: a random process defined on a 2-D space rather than on 1-D time. For a monochrome image, the intensity of each pixel f(x, y) = X_{x,y}(ω) is modeled as a random variable. For a particular outcome ω_i, f(x, y) is a deterministic function of x and y.
- All results applicable to random processes can be applied to random fields.

## Mean, Correlation and Covariance

- Mean: if X_t is a random process, its mean function is m_X(t) = E[X_t], where the expectation is taken w.r.t. the PMF or pdf of X_t at time t.
- Correlation: R_X(u, v) = E[X_u X_v].
- Covariance: C_X(u, v) = R_X(u, v) − m_X(u) m_X(v).
- Example a: denote s_i(t) the time function of the i-th song; then m_X(t) = (1/6) Σ_i s_i(t).
- Example d: given that X_0 = 5, P(X_1 = 4) = 1, so m_X(1) = 4. P(X_2 = 3) = (4/5)(4/5) = 16/25, P(X_2 = 4) = (4/5)(1/5) + (1/5)(4/5) = 8/25, and P(X_2 = 5) = 1/25. Thus m_X(2) = 3(16/25) + 4(8/25) + 5(1/25) = 17/5.
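The arithmetic of Example d can be checked with exact fractions (only the PMF values quoted on the slide are used; the process generating them is defined elsewhere in the course):

```python
from fractions import Fraction

# PMF of X_2 as quoted on the slide
pmf_X2 = {3: Fraction(16, 25), 4: Fraction(8, 25), 5: Fraction(1, 25)}
assert sum(pmf_X2.values()) == 1     # the quoted probabilities form a valid PMF

# m_X(2) = E[X_2]
m_X2 = sum(x * p for x, p in pmf_X2.items())
print(m_X2)  # 17/5
```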

## Stationary Process and WSS

- Any property that depends on the values of {X_t} at k index points t_1, t_2, …, t_k is completely characterized by the joint pdf (or PMF) of X_{t_1}, X_{t_2}, …, X_{t_k}, denoted (in the pdf case) f(X(t_1), …, X(t_k)).
- Definition (stationary process): {X_t} is (strictly) stationary if for any finite set of time points {t_1, t_2, …, t_k}, the joint pdf is time invariant: f(X(t_1), …, X(t_k)) = f(X(t_1 + τ), …, X(t_k + τ)) for every shift τ.
- Definition (wide-sense stationary): {X_t} is wide-sense stationary (WSS) if its first two moments are independent of time, i.e., m_X(t) = E[X_t] = m_X and R_X(u, v) = R_X(u − v).
- Letting u = t + τ and v = t, we may write R_X(u, v) = R_X((t + τ) − t) = R_X(τ).

## Power Spectral Density and Power

- Definition (power spectral density): S_X(f) is the Fourier transform of the correlation function R_X(τ). It gives the power density of the process X_t at each frequency f, and hence must be non-negative.
- Definition (power): P_X = R_X(0) = ∫ S_X(f) df.
- Properties of the PSD and correlation function:
  - (a) R(τ) = R(−τ). Hence S_X(f) is a real-valued, even function.
  - (b) |R(τ)| ≤ R(0). To prove this, use the Cauchy-Schwarz inequality: (E[UV])² ≤ E[U²]E[V²].
  - (c) S_X(f) is real, even and nonnegative.

## LTI System: A Brief Review

- A system y(t) = L[x(t)] is a mapping of a function x(t) to a function y(t).
- L[·] is a linear system iff L[a x_1 + b x_2] = a L[x_1] + b L[x_2].
- L[·] is time invariant iff L[x(t + u)] = y(t + u).
- An LTI (linear, time-invariant) system can be uniquely characterized by its impulse response h(t) = L[δ(t)].
- Given an LTI system y(t) = L[x(t)], y(t) can be obtained via the convolution of x(t) with the impulse response: y(t) = ∫ h(τ) x(t − τ) dτ.
- The Fourier transform of h(t) is called the transfer function H(f).
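A discrete-time sketch of the convolution y = h * x for finite sequences (continuous-time convolution replaces the sum with an integral):

```python
def convolve(x, h):
    """Discrete convolution y[n] = sum_k h[k] x[n - k] of finite sequences."""
    y = [0.0] * (len(x) + len(h) - 1)
    for n in range(len(y)):
        for k in range(len(h)):
            if 0 <= n - k < len(x):
                y[n] += h[k] * x[n - k]
    return y

# A unit impulse through the system returns the impulse response itself,
# which is exactly the statement h(t) = L[delta(t)].
h = [1.0, 0.5, 0.25]
print(convolve([1.0], h))  # [1.0, 0.5, 0.25]
```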

## LTI System with WSS Input

- Let a WSS process X_t be the input to an LTI system with impulse response h(t). The output is denoted by Y_t.
- If E[X_t] = m, then E[Y_t] = m ∫ h(τ) dτ = m H(0).
- Cross-correlation between X_t and Y_t: define R_XY(u, v) = E[X_u Y_v].

## WSS Input to an LTI System (Cont'd)

- Substituting t − s with τ and taking Fourier transforms, define the cross power spectral density S_XY(f) as the Fourier transform of R_XY(τ). Then S_XY(f) = H*(f) S_X(f).
- Therefore the output PSD is S_Y(f) = |H(f)|² S_X(f).
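The relation S_Y(f) = |H(f)|² S_X(f) can be verified deterministically for a discrete-time FIR filter driven by white noise, for which R_X(τ) = σ²δ(τ) and hence S_X(f) = σ²; the filter taps and σ² below are arbitrary:

```python
from math import cos, sin, pi, isclose

h = [1.0, -0.5, 0.25]     # arbitrary FIR impulse response
sigma2 = 2.0              # white-noise input: S_X(f) = sigma^2 at every f
L = len(h)

# For white noise in, R_Y(tau) = sigma^2 * sum_k h[k] h[k + tau]  (R_Y is even)
R_Y = [sigma2 * sum(h[k] * h[k + tau] for k in range(L - tau)) for tau in range(L)]

def S_Y(f):
    """PSD from the even autocorrelation: R_Y(0) + 2 sum_{tau>0} R_Y(tau) cos(2 pi f tau)."""
    return R_Y[0] + 2 * sum(R_Y[t] * cos(2 * pi * f * t) for t in range(1, L))

def H_mag2(f):
    """|H(f)|^2 for the FIR filter h."""
    re = sum(h[n] * cos(2 * pi * f * n) for n in range(L))
    im = sum(h[n] * sin(2 * pi * f * n) for n in range(L))
    return re * re + im * im

for f in (0.0, 0.1, 0.25, 0.4):
    assert isclose(S_Y(f), sigma2 * H_mag2(f), abs_tol=1e-12)
print("S_Y(f) = |H(f)|^2 S_X(f) verified at several frequencies")
```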
