2. Conditioning and Independence
Conditioning leads to revised ("conditional") probabilities that take into account partial information on the outcome of a probabilistic experiment. Conditioning is a very useful tool that allows us to "divide and conquer" complex problems. Independence is used to model situations involving non-interacting probabilistic phenomena and also plays an important role in building complex models from more elementary ones.
Conditioning and Bayes' rule
- Conditional Probability
The conditional probability of an event given another event is the probability of their intersection divided by the probability of the conditioning event.
- Three important tools
- Multiplication rule
- Total probability theorem
- Bayes' rule - provides a systematic way to incorporate new evidence into a probability model, and is the foundation of the field of inference. It is a guide on how to process data and make inferences about unobserved quantities or phenomena.
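The three tools above can be exercised on one small numeric sketch. The urn model below (3 red and 2 blue balls, two draws without replacement) is a hypothetical example, not taken from the notes; it applies the multiplication rule, the total probability theorem, and Bayes' rule in turn.

```python
from fractions import Fraction

# Hypothetical urn model (illustrative, not from the notes):
# 3 red and 2 blue balls; two draws without replacement.
# Ri = event that the i-th draw is red.
p_R1 = Fraction(3, 5)            # P(R1)
p_R2_given_R1 = Fraction(2, 4)   # P(R2 | R1): 2 reds left among 4 balls
p_R2_given_B1 = Fraction(3, 4)   # P(R2 | first draw blue): 3 reds among 4

# Multiplication rule: P(R1 ∩ R2) = P(R1) * P(R2 | R1)
p_R1_and_R2 = p_R1 * p_R2_given_R1

# Total probability theorem: condition on the outcome of the first draw.
p_R2 = p_R1 * p_R2_given_R1 + (1 - p_R1) * p_R2_given_B1

# Bayes' rule: P(R1 | R2) = P(R1) * P(R2 | R1) / P(R2)
p_R1_given_R2 = p_R1 * p_R2_given_R1 / p_R2

print(p_R1_and_R2, p_R2, p_R1_given_R2)  # 3/10 3/5 1/2
```

Using `Fraction` keeps every probability exact, so the identities hold with equality rather than up to floating-point error.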
P(A | B) denotes the probability of A given B; here B is called the conditioning event.
P(A | B) = P(A ∩ B) / P(B), defined whenever P(B) > 0.
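The definition of conditional probability can be checked directly on a small discrete model. The sketch below (two fair dice, a hypothetical setup chosen for illustration) computes P(A | B) as the ratio of event probabilities under a uniform sample space.

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair dice, all 36 outcomes equally likely.
omega = list(product(range(1, 7), repeat=2))

def prob(event):
    # Discrete uniform probability of an event (a set of outcomes).
    return Fraction(len(event), len(omega))

def cond_prob(A, B):
    # P(A | B) = P(A ∩ B) / P(B), defined only when P(B) > 0.
    return prob(A & B) / prob(B)

A = {w for w in omega if w[0] == 6}          # first die shows 6
B = {w for w in omega if w[0] + w[1] == 8}   # sum of the dice is 8

print(cond_prob(A, B))  # 1/5
```

Observing B shrinks the sample space to the 5 outcomes that sum to 8, of which exactly one has the first die showing 6, hence 1/5.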
Conditional probabilities share the properties of ordinary probabilities: for a fixed conditioning event B, P(· | B) satisfies the probability axioms.
- Conditional probabilities must be non-negative
- P(Ω | B) = 1
- If A1 and A2 are disjoint, then P(A1 ∪ A2 | B) = P(A1 | B) + P(A2 | B)
Inference - Having observed B, we make inferences as to how likely a particular scenario Ai is.
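The inference step can be sketched as follows: given priors P(Ai) and likelihoods P(B | Ai), Bayes' rule ranks the scenarios after B is observed. The two-coin setup below is a hypothetical example chosen for illustration.

```python
from fractions import Fraction

# Hypothetical scenarios (illustrative, not from the notes):
# A1 = the coin is fair, A2 = the coin is biased toward heads.
priors = {"A1": Fraction(1, 2), "A2": Fraction(1, 2)}       # P(Ai)
p_heads = {"A1": Fraction(1, 2), "A2": Fraction(3, 4)}      # P(B | Ai), B = heads

# Total probability theorem: P(B) = sum_i P(Ai) * P(B | Ai)
p_B = sum(priors[a] * p_heads[a] for a in priors)

# Bayes' rule: P(Ai | B) = P(Ai) * P(B | Ai) / P(B)
posterior = {a: priors[a] * p_heads[a] / p_B for a in priors}

print(posterior)  # {'A1': Fraction(2, 5), 'A2': Fraction(3, 5)}
```

Having observed heads, the biased-coin scenario becomes more likely (3/5) than the fair-coin scenario (2/5), and the posteriors sum to 1 as they must.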