Bayes Nets Rong Jin
Hidden Markov Model

O0  O1  O2  O3  O4   (observations)
q0  q1  q2  q3  q4   (hidden states)

Inferring from the observations (oi) to the hidden variables (qi). This is a general framework for representing and reasoning about uncertainty:
- Represent uncertain information with random variables (nodes)
- Represent the relationships between variables with conditional probability distributions (directed arcs)
- Infer from the observed variables (shaded nodes) to the hidden variables (circled nodes)
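The chain structure drawn above implies a factored joint distribution. This is the standard HMM factorization (not written out on the slide itself): each hidden state depends only on the previous state, and each observation only on its state.

```latex
P(q_0,\dots,q_4,\, o_0,\dots,o_4)
  \;=\; P(q_0)\,\prod_{t=1}^{4} P(q_t \mid q_{t-1})
         \,\prod_{t=0}^{4} P(o_t \mid q_t)
```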
An Example of Bayes Network

- S: It is sunny
- L: Ali arrives slightly late
- O: Slides are put on the web late
Bayes Network Example (S -> L <- O)

- Absence of an arrow between S and O: the random variables S and O are independent. Knowing S will not help predict O.
- Two arrows into L: L depends on S and O. Knowing S and O will help predict L.
Inference in Bayes Network (S -> L <- O)

- Given S = 1, O = 0: P(L) = ?
- Given S = 1: P(O) = ?, P(L) = ?
- Given L = 1: P(S) = ?, P(O) = ?
- Given L = 1, S = 1: P(O) = ?
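The queries above can be answered by enumeration once the CPTs are fixed. Here is a minimal sketch for the S -> L <- O network; the CPT numbers are made-up illustrative values, not given on the slides.

```python
# Inference by enumeration in the S -> L <- O network.
# All probability values below are illustrative assumptions.
P_S = {1: 0.3, 0: 0.7}             # P(S): it is sunny
P_O = {1: 0.2, 0: 0.8}             # P(O): slides on web late (no arrow from S)
P_L = {                            # P(L=1 | S, O)
    (1, 1): 0.9, (1, 0): 0.6,
    (0, 1): 0.5, (0, 0): 0.1,
}

def p_l_given(s, o):
    """P(L=1 | S=s, O=o): read directly from the CPT, since S and O
    are L's parents."""
    return P_L[(s, o)]

def p_s_given_l1():
    """P(S=1 | L=1) by Bayes' rule, summing out O."""
    num = sum(P_S[1] * P_O[o] * P_L[(1, o)] for o in (0, 1))
    den = sum(P_S[s] * P_O[o] * P_L[(s, o)]
              for s in (0, 1) for o in (0, 1))
    return num / den

print(p_l_given(1, 0))   # P(L=1 | S=1, O=0) = 0.6
print(p_s_given_l1())    # P(S=1 | L=1) ~ 0.611
```

Queries "against the arrows" (from L back to S) need the full sum in the denominator; queries "along the arrows" are just CPT lookups.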
Conditional Independence

Formal definition: A and B are conditionally independent given C iff
P(A, B | C) = P(A | C) P(B | C)

This is different from (unconditional) independence. Example (A <- C -> B):
- A: shoe size
- B: glove size
- C: height

Shoe size is not independent of glove size (both are correlated through height), yet they are conditionally independent given height.
Distinguish Two Cases

Case 1 (A <- C -> B): A: shoe size, B: glove size, C: height
- Given C: A and B are independent
- Without C: A and B can be dependent

Case 2 (S -> L <- O): S: It is sunny, L: Ali arrives slightly late, O: Slides are put on the web late
- Without L: S and O are independent
- Given L: S and O can be dependent
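Case 2 is the "explaining away" effect, and it can be checked numerically. The sketch below uses made-up CPT values for the S -> L <- O network (assumptions, not from the slides): S and O are independent by construction, but become dependent once L is observed.

```python
# Numeric check of "explaining away" in the S -> L <- O network.
# CPT values are illustrative assumptions.
P_S = {1: 0.3, 0: 0.7}
P_O = {1: 0.2, 0: 0.8}
P_L1 = {(1, 1): 0.9, (1, 0): 0.6, (0, 1): 0.5, (0, 0): 0.1}  # P(L=1 | S, O)

def joint(s, o, l):
    p_l = P_L1[(s, o)] if l == 1 else 1 - P_L1[(s, o)]
    return P_S[s] * P_O[o] * p_l

def p_s1_given_l1():
    """P(S=1 | L=1), summing out O."""
    num = sum(joint(1, o, 1) for o in (0, 1))
    den = sum(joint(s, o, 1) for s in (0, 1) for o in (0, 1))
    return num / den

def p_s1_given_l1_o1():
    """P(S=1 | L=1, O=1): observing O=1 "explains away" S."""
    return joint(1, 1, 1) / (joint(1, 1, 1) + joint(0, 1, 1))

# Marginally, S and O are independent by construction of the net:
p_s1_o1 = sum(joint(1, 1, l) for l in (0, 1))
assert abs(p_s1_o1 - P_S[1] * P_O[1]) < 1e-12

print(p_s1_given_l1())      # ~0.611
print(p_s1_given_l1_o1())   # ~0.435: lower, because O=1 already explains L=1
```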
Another Example for Bayes Nets

Cloudy -> Sprinkler, Cloudy -> Rain, Sprinkler -> WetGrass, Rain -> WetGrass

Inference questions:
- Given W = 1: P(R) = ?
- Given W = 1: P(C) = ?
- Given W = 1, C = 1: P(S) = ?, P(C) = ?, P(S, R) = ?
Bayes Nets Formalized

A Bayes net (also called a belief network) is an augmented directed acyclic graph, represented by the pair (V, E) where:
- V is a set of vertices.
- E is a set of directed edges joining vertices. No loops of any length are allowed.

Each vertex in V contains the following information:
- The name of a random variable
- A probability distribution table indicating how the probability of this variable's values depends on all possible combinations of parental values.
Building a Bayes Net

1. Choose a set of relevant variables.
2. Choose an ordering for them.
3. Assume they're called X1 .. Xm (where X1 is the first in the ordering, X2 is the second, etc.)
4. For i = 1 to m:
   1. Add the Xi node to the network.
   2. Set Parents(Xi) to be a minimal subset of {X1 … Xi-1} such that Xi is conditionally independent of all other members of {X1 … Xi-1} given Parents(Xi).
   3. Define the probability table of P(Xi = k | assignments of Parents(Xi)).
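Once step 4 is finished, the full joint is simply the product of the local conditionals. A minimal sketch of that product, with a hypothetical two-node network A -> B and made-up probabilities:

```python
# Joint probability of a full assignment as the product of local CPTs:
# P(x_1, ..., x_m) = prod_i P(x_i | Parents(X_i)).
def joint_prob(assignment, parents, cpt):
    """assignment: dict var -> value
    parents: dict var -> tuple of parent vars (consistent with the ordering)
    cpt: dict var -> {(parent values..., own value): probability}"""
    p = 1.0
    for var, pa in parents.items():
        key = tuple(assignment[q] for q in pa) + (assignment[var],)
        p *= cpt[var][key]
    return p

# Hypothetical chain A -> B with illustrative probabilities:
parents = {"A": (), "B": ("A",)}
cpt = {
    "A": {(0,): 0.4, (1,): 0.6},
    "B": {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.8},
}
print(joint_prob({"A": 1, "B": 1}, parents, cpt))  # 0.6 * 0.8 = 0.48
```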
Example of Building Bayes Nets

Suppose we're building a nuclear power station. There are the following random variables:
- GRL: Gauge Reads Low.
- CTL: Core temperature is low.
- FG: Gauge is faulty.
- FA: Alarm is faulty.
- AS: Alarm sounds.

- If the alarm is working properly, the alarm is meant to sound if the gauge stops reading a low temp.
- If the gauge is working properly, the gauge is meant to read the temp of the core.
Bayes Net for Power Station

Edges: CTL -> GRL, FG -> GRL, GRL -> AS, FA -> AS

- GRL (Gauge Reads Low) depends on CTL (Core temperature is low) and FG (Gauge is faulty).
- AS (Alarm sounds) depends on GRL and FA (Alarm is faulty).
Inference with Bayes Nets

- Key issue: computing the joint probability P(X1 = x1 ^ X2 = x2 ^ … ^ Xn-1 = xn-1 ^ Xn = xn)
- Use the conditional independence relations to simplify the computation
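The simplification works by applying the chain rule and then dropping every conditioning variable that is not a parent, which is exactly what the conditional independences in the net license:

```latex
P(x_1, \dots, x_n)
  \;=\; \prod_{i=1}^{n} P(x_i \mid x_1, \dots, x_{i-1})
  \;=\; \prod_{i=1}^{n} P\!\left(x_i \mid \mathrm{Parents}(X_i)\right)
```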
Example for Inference

Cloudy -> Sprinkler, Cloudy -> Rain, Sprinkler -> WetGrass, Rain -> WetGrass

Inference questions:
- Given W = 1: P(R) = ?
- Given W = 1: P(C) = ?
- Given W = 1, C = 1: P(S) = ?, P(C) = ?, P(S, R) = ?
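These queries can be answered by enumeration over the factored joint. The slides give only the structure of the Cloudy/Sprinkler/Rain/WetGrass net, so the CPT values below are illustrative assumptions.

```python
# Inference by enumeration on the Cloudy/Sprinkler/Rain/WetGrass network.
# CPT values are illustrative assumptions, not from the slides.
P_C = 0.5
P_S_given_C = {1: 0.1, 0: 0.5}        # P(Sprinkler=1 | Cloudy)
P_R_given_C = {1: 0.8, 0: 0.2}        # P(Rain=1 | Cloudy)
P_W_given_SR = {(1, 1): 0.99, (1, 0): 0.9, (0, 1): 0.9, (0, 0): 0.0}

def joint(c, s, r, w):
    """P(C=c, S=s, R=r, W=w) via the Bayes-net factorization."""
    pc = P_C if c else 1 - P_C
    ps = P_S_given_C[c] if s else 1 - P_S_given_C[c]
    pr = P_R_given_C[c] if r else 1 - P_R_given_C[c]
    pw = P_W_given_SR[(s, r)] if w else 1 - P_W_given_SR[(s, r)]
    return pc * ps * pr * pw

def query(target, evidence):
    """P(target=1 | evidence) by summing the joint over all assignments."""
    num = den = 0.0
    for c in (0, 1):
        for s in (0, 1):
            for r in (0, 1):
                for w in (0, 1):
                    a = dict(C=c, S=s, R=r, W=w)
                    if any(a[k] != v for k, v in evidence.items()):
                        continue
                    p = joint(c, s, r, w)
                    den += p
                    if a[target] == 1:
                        num += p
    return num / den

print(query("R", {"W": 1}))   # P(Rain=1 | WetGrass=1)    ~0.708
print(query("C", {"W": 1}))   # P(Cloudy=1 | WetGrass=1)  ~0.576
```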
Problem with Inference using Bayes Nets

- Inference: infer from observed variables E_O to unknown variables E_U.
- Suppose you have m binary-valued variables in your Bayes net and the expression E_O mentions k variables. How much work is the above computation? Naive enumeration must sum the joint over all 2^(m-k) assignments of the unmentioned variables, so the cost grows exponentially in m.
Problem with Inference using Bayes Nets

- General querying of Bayes nets is NP-complete.
- Some solutions:
  - Belief propagation: take advantage of the structure of Bayes nets.
  - Stochastic simulation: similar to the sampling approaches for the Bayesian average.
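A minimal sketch of stochastic simulation for the Cloudy/Sprinkler/Rain/WetGrass net: sample full assignments forward along the arrows, keep only the samples consistent with the evidence, and average (rejection sampling, one simple instance of the idea; CPT values are illustrative assumptions).

```python
import random

# Rejection sampling on the Cloudy/Sprinkler/Rain/WetGrass network.
# CPT values are illustrative assumptions, not from the slides.
def sample_net(rng):
    """Draw one full assignment by sampling each node given its parents."""
    c = rng.random() < 0.5
    s = rng.random() < (0.1 if c else 0.5)
    r = rng.random() < (0.8 if c else 0.2)
    pw = {(True, True): 0.99, (True, False): 0.9,
          (False, True): 0.9, (False, False): 0.0}[(s, r)]
    w = rng.random() < pw
    return c, s, r, w

def estimate_p_rain_given_wet(n=200_000, seed=0):
    """Estimate P(Rain=1 | WetGrass=1): discard samples with W=0."""
    rng = random.Random(seed)
    kept = rains = 0
    for _ in range(n):
        c, s, r, w = sample_net(rng)
        if w:
            kept += 1
            rains += r
    return rains / kept

print(estimate_p_rain_given_wet())   # approaches ~0.708 as n grows
```

Rejection sampling wastes every sample inconsistent with the evidence, which is why smarter schemes (likelihood weighting, Gibbs sampling) are used when the evidence is unlikely.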
More Interesting Questions

- Learning Bayes nets: given the topological structure of a Bayes net, learn all the conditional probability tables from examples.
  - Example: hierarchical mixture model
- Learning the topological structure of a Bayes net:
  - A very, very hard question
  - Unfortunately, the lecturer does not have enough knowledge to teach it, even if he wants to!
Learning Cond. Probabilities in Bayes Nets

Network: Cloudy -> Sprinkler, Cloudy -> Rain, Sprinkler -> WetGrass, Rain -> WetGrass

Three types of training examples:
1. Complete: (C, S, R, W)
2. S unobserved: (C, R, W)
3. R unobserved: (S, C, W)

Maximum likelihood approach for estimating the conditional probabilities:
- EM algorithm for optimization
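For fully observed examples (case 1), maximum likelihood reduces to counting: each CPT entry is a ratio of observed frequencies. A minimal sketch with made-up training tuples (the data below are hypothetical, not from the slides):

```python
from collections import Counter

# ML estimation of P(Rain | Cloudy) from fully observed (C, S, R, W) tuples.
# The data are made-up illustrative examples.
data = [
    (1, 0, 1, 1), (1, 0, 1, 1), (1, 0, 0, 0), (0, 1, 0, 1),
    (0, 1, 0, 1), (0, 0, 0, 0), (1, 1, 1, 1), (0, 0, 1, 1),
]

def mle_p_r_given_c(data):
    """P(R=1 | C=c) = count(C=c, R=1) / count(C=c)."""
    r_count, c_count = Counter(), Counter()
    for c, s, r, w in data:
        c_count[c] += 1
        r_count[c] += r
    return {c: r_count[c] / c_count[c] for c in c_count}

print(mle_p_r_given_c(data))   # {1: 0.75, 0: 0.25} for the data above
```

With partially observed examples (cases 2 and 3) these counts are unavailable, which is where EM comes in: the E-step fills in expected counts for the missing variables, and the M-step re-estimates the CPTs from them.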