- A message-passing algorithm for performing inference on a graphical model.
- Let $X_1, \dots, X_n$ be random variables with a joint probability function $p(x_1, \dots, x_n)$. The task is to compute the marginal distributions $p(x_i)$. Define a factor graph with variables $X_1, \dots, X_n$ and factors $f_a$, so that $p(x_1, \dots, x_n) \propto \prod_a f_a(x_a)$, where $x_a$ denotes the variables adjacent to factor $a$.
- Let $v$ be a variable node and $a$ a factor node connected to $v$ in the factor graph. Let $f_a$ be the factor corresponding to $a$.
- A message in this case is a re-usable partial sum that can be used in computing the marginal probability $p(x_v)$.
- A message $\mu_{v \to a}(x_v)$ is defined as follows:
  $$\mu_{v \to a}(x_v) = \prod_{a' \in N(v) \setminus \{a\}} \mu_{a' \to v}(x_v),$$
  where $N(v)$ is the set of neighboring factor nodes of $v$. If $N(v) \setminus \{a\}$ is empty, we use the uniform probability distribution.
- A message $\mu_{a \to v}(x_v)$ is defined as follows:
  $$\mu_{a \to v}(x_v) = \sum_{x_a \setminus \{x_v\}} f_a(x_a) \prod_{v' \in N(a) \setminus \{v\}} \mu_{v' \to a}(x_{v'}),$$
  where $N(a)$ is the set of neighboring variable nodes of $a$. If $N(a) \setminus \{v\}$ is empty, we have $\mu_{a \to v}(x_v) = f_a(x_v)$.
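As a rough illustration of the two update rules, here is a minimal NumPy sketch; the function names, the table layout (one array axis per variable adjacent to a factor), and the argument conventions are assumptions made for this example, not part of the definitions above.

```python
import numpy as np

def variable_to_factor(incoming, n_states):
    """mu_{v -> a}(x_v): product of the messages from v's other neighboring
    factors; if there are none, the uniform distribution, as defined above."""
    if not incoming:
        return np.full(n_states, 1.0 / n_states)
    msg = np.ones(n_states)
    for m in incoming:
        msg *= m
    return msg

def factor_to_variable(factor, target_axis, incoming):
    """mu_{a -> v}(x_v): weight the factor table by the messages from a's other
    neighboring variables, then sum out every axis except the one for v.

    factor      : ndarray with one axis per variable adjacent to a
    target_axis : axis of `factor` corresponding to v
    incoming    : dict {axis: message from the variable on that axis}, v excluded
    """
    weighted = factor.astype(float)
    for axis, m in incoming.items():
        shape = [1] * factor.ndim
        shape[axis] = -1
        weighted = weighted * m.reshape(shape)   # broadcast the message onto its axis
    other_axes = tuple(ax for ax in range(factor.ndim) if ax != target_axis)
    # Summing over no axes leaves the table unchanged, so a factor whose only
    # neighbor is v correctly returns f_a(x_v).
    return weighted.sum(axis=other_axes)
```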
- The marginal probabilities can be approximated as follows:
  $$p(x_v) \propto \prod_{a \in N(v)} \mu_{a \to v}(x_v).$$
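For example, on a tiny two-variable factor graph the belief computed this way matches the exact marginal, since the graph is acyclic. The factor values below are made up purely for illustration:

```python
import numpy as np

# Two binary variables x1, x2 with unary factors f1, f2 and a pairwise factor f12.
f1  = np.array([0.7, 0.3])
f2  = np.array([0.4, 0.6])
f12 = np.array([[0.9, 0.1],
                [0.2, 0.8]])          # f12[x1, x2]

# Leaf messages: each unary factor's only neighbor is its variable,
# so mu_{f_i -> x_i} = f_i, and mu_{x2 -> f12} is just that message.
mu_f1_x1  = f1
mu_x2_f12 = f2

# Factor-to-variable message: sum out x2, weighting by mu_{x2 -> f12}.
mu_f12_x1 = (f12 * mu_x2_f12[None, :]).sum(axis=1)

# Belief at x1: product of all incoming factor messages, then normalize.
belief_x1 = mu_f1_x1 * mu_f12_x1
belief_x1 /= belief_x1.sum()

# Brute-force marginal for comparison; on this acyclic graph they coincide.
joint    = f1[:, None] * f2[None, :] * f12
exact_x1 = joint.sum(axis=1) / joint.sum()

print(belief_x1, exact_x1)            # identical up to floating point
```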
- A maximum a posteriori (MAP) estimate can be obtained for $x_v$ by using a max instead of the sum in the factor-to-variable messages and taking an argmax at the end.
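A minimal sketch of this max-product variant on the same illustrative factors as above (the values are again made up): the sum over $x_2$ becomes a max, and the argmax of the resulting belief gives the MAP value of $x_1$.

```python
import numpy as np

f1  = np.array([0.7, 0.3])
f2  = np.array([0.4, 0.6])
f12 = np.array([[0.9, 0.1],
                [0.2, 0.8]])

# Max-product message: replace the sum over x2 with a max over x2.
mu_f12_x1_max = (f12 * f2[None, :]).max(axis=1)

# Max-marginal of x1 and its MAP value.
max_marginal_x1 = f1 * mu_f12_x1_max
x1_map = int(np.argmax(max_marginal_x1))

# Brute-force check: maximize the joint over x2, then take argmax over x1.
joint = f1[:, None] * f2[None, :] * f12
assert x1_map == int(np.argmax(joint.max(axis=1)))
print(x1_map)
```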
- The Belief Propagation algorithm gives an exact value if the PGM is acyclic. However, it can still be run on graphs with cycles ("loopy" BP), although without exactness guarantees.
- The update rule for BP can be modified as long as the set of fixed points remains the same.
- For notational brevity, let $\mu^{(t)}$ denote the message to be passed at the $t$-th step. Then, with a damped update rule with damping factor $\lambda \in (0, 1]$,
  $$\mu^{(t+1)} = (1 - \lambda)\, \mu^{(t)} + \lambda\, \tilde{\mu}^{(t+1)},$$
  where $\tilde{\mu}^{(t+1)}$ is the message produced by the standard BP update from $\mu^{(t)}$. Setting $\lambda = 1$ recovers the plain update, and the set of fixed points is unchanged.
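A minimal sketch of such a damped step, assuming messages are stored as NumPy arrays; the function name and the default damping factor are illustrative choices, not part of the notes above.

```python
import numpy as np

def damped_update(old_msg, bp_msg, lam=0.5):
    """Damped step: mu^(t+1) = (1 - lam) * mu^(t) + lam * BP(mu^(t)).

    lam = 1 recovers the plain BP update; smaller values change the messages
    more slowly, which often helps loopy BP converge.  Any fixed point of the
    plain update is also a fixed point here: if old_msg == bp_msg, the damped
    message is unchanged.
    """
    return (1.0 - lam) * np.asarray(old_msg) + lam * np.asarray(bp_msg)
```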