Part 22, Section 5: Computing Branch Probabilities with Bayes' Theorem
Bayes' theorem can be used to compute branch probabilities for decision trees.
The notation | in P(s1 | F) and P(s2 | F) is read as "given" and indicates a conditional probability: we are interested in the probability of a particular state of nature "conditioned" on the fact that we receive a favorable market report. P(s1 | F) and P(s2 | F) are referred to as posterior probabilities because they are conditional probabilities based on the outcome of the sample information. Example: the PDC decision tree (figure ?).
F = Favorable market research report
U = Unfavorable market research report
s1 = Strong demand (state of nature 1)
s2 = Weak demand (state of nature 2)

In performing the probability computations, we need PDC's assessment of the probabilities of the two states of nature, P(s1) and P(s2). We must also know the conditional probability of each market research outcome given each state of nature; that is, to carry out the probability calculations we need conditional probabilities for all sample outcomes given all states of nature. In the PDC problem we assume that the following assessments are available for these conditional probabilities:
| State of Nature | Favorable, F | Unfavorable, U |
|---|---|---|
| Strong demand, s1 | P(F \| s1) = 0.90 | P(U \| s1) = 0.10 |
| Weak demand, s2 | P(F \| s2) = 0.25 | P(U \| s2) = 0.75 |
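As a sketch of the computation these conditionals feed into, the snippet below applies Bayes' theorem to obtain the posterior probabilities P(s | F) and P(s | U). The conditional probabilities are taken from the table above; the prior values P(s1) = 0.8 and P(s2) = 0.2 are illustrative assumptions, since the priors are not stated in this excerpt.

```python
# Posterior branch probabilities for the PDC problem via Bayes' theorem.
priors = {"s1": 0.8, "s2": 0.2}              # P(s1), P(s2) -- assumed for illustration
cond_favorable = {"s1": 0.90, "s2": 0.25}    # P(F | s), from the table
cond_unfavorable = {"s1": 0.10, "s2": 0.75}  # P(U | s), from the table

def posteriors(priors, likelihoods):
    """Return P(s | outcome) for each state s, given P(s) and P(outcome | s)."""
    # Joint probabilities P(s) * P(outcome | s)
    joint = {s: priors[s] * likelihoods[s] for s in priors}
    # P(outcome), by the law of total probability
    total = sum(joint.values())
    return {s: joint[s] / total for s in joint}

post_f = posteriors(priors, cond_favorable)    # P(s1 | F), P(s2 | F)
post_u = posteriors(priors, cond_unfavorable)  # P(s1 | U), P(s2 | U)
print(post_f)
print(post_u)
```

With these assumed priors, a favorable report raises the probability of strong demand from 0.8 to about 0.935, while an unfavorable report lowers it to about 0.348.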
Bayes’ Theorem
\[ P(A_i \mid B) = \frac{P(A_i)\,P(B \mid A_i)}{P(A_1)\,P(B \mid A_1) + P(A_2)\,P(B \mid A_2) + \cdots} \]
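The general formula above can be sketched directly in code: the numerator is the joint probability for the event of interest, and the denominator sums the joint probabilities over all mutually exclusive events A_1, A_2, .... The numeric arguments in the usage line are illustrative (assumed priors 0.8 and 0.2 with the table's favorable-report likelihoods), not values fixed by this excerpt.

```python
def bayes(prior, likelihood, i):
    """P(A_i | B) from priors P(A_k) and likelihoods P(B | A_k).

    prior and likelihood are parallel sequences indexed by the same k;
    i selects which event A_i the posterior is computed for.
    """
    numerator = prior[i] * likelihood[i]                      # P(A_i) * P(B | A_i)
    denominator = sum(p * l for p, l in zip(prior, likelihood))  # P(B)
    return numerator / denominator

# Illustrative usage with assumed priors and the table's P(F | s) values:
p_s1_given_f = bayes([0.8, 0.2], [0.90, 0.25], 0)
print(p_s1_given_f)
```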