
3.1 Bayesian approach

In the Bayesian approach we assign costs to our decisions; in particular, we introduce positive numbers Cij, i,j = 0,1, where Cij is the cost incurred by choosing hypothesis Hi when hypothesis Hj is true. We define the conditional risk Rj of a decision rule δ under hypothesis Hj as
Rj(δ) = C0j Pj(ℛ′) + C1j Pj(ℛ), j = 0,1, (17)
where Pj is the probability distribution of the data when hypothesis Hj is true, and ℛ is the region of the data space in which we choose hypothesis H1 (ℛ′ being its complement). Next we assign probabilities π0 and π1 = 1 − π0 to the occurrences of hypotheses H0 and H1, respectively. These probabilities are called a priori probabilities or priors. We define the Bayes risk as the overall average cost incurred by the decision rule δ:
r(δ) = π0R0 (δ) + π1R1(δ). (18 )
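As an illustrative sketch (the numerical setup is assumed, not taken from the text), the conditional risks of Eq. (17) and the Bayes risk of Eq. (18) can be evaluated for a simple one-sample Gaussian problem: under H0 the datum x is N(0,1), under H1 it is N(μ1,1), and the rule δ chooses H1 whenever x exceeds a threshold η. The cost matrix, priors, and mean below are all hypothetical defaults.

```python
import math

def gaussian_sf(x, mean=0.0):
    """Survival function P(X > x) for a unit-variance normal distribution."""
    return 0.5 * math.erfc((x - mean) / math.sqrt(2.0))

def bayes_risk(eta, mu1=1.0, pi0=0.5, C=((0.0, 1.0), (1.0, 0.0))):
    """Bayes risk r(δ) of the threshold rule δ: 'choose H1 iff x > eta'.

    C[i][j] is the cost of choosing H_i when H_j is true; the default
    is the 0-1 cost matrix.  P_j(ℛ) is the probability, under H_j,
    that the datum falls in the H1-decision region x > eta.
    """
    pi1 = 1.0 - pi0
    # P_j(ℛ) for j = 0, 1
    P0_R = gaussian_sf(eta, mean=0.0)    # false-alarm probability
    P1_R = gaussian_sf(eta, mean=mu1)    # detection probability
    # Conditional risks, Eq. (17): R_j = C_0j P_j(ℛ′) + C_1j P_j(ℛ)
    R0 = C[0][0] * (1.0 - P0_R) + C[1][0] * P0_R
    R1 = C[0][1] * (1.0 - P1_R) + C[1][1] * P1_R
    # Bayes risk, Eq. (18)
    return pi0 * R0 + pi1 * R1

print(round(bayes_risk(0.5), 4))  # → 0.3085
```

With 0-1 costs and equal priors the Bayes risk reduces to the average error probability, which for η = 0.5 and μ1 = 1 equals Q(0.5) ≈ 0.3085.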
Finally, we define the Bayes rule as the rule that minimizes the Bayes risk r(δ).
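For the hypothetical Gaussian setup of the previous example, the Bayes rule can be found numerically and compared with the standard analytic result (not spelled out above): choose H1 when the likelihood ratio p1(x)/p0(x) exceeds π0(C10 − C00)/(π1(C01 − C11)), which for unit-variance Gaussians is again a threshold rule in x. All numerical values below are assumed for the sketch.

```python
import math

def gaussian_sf(x, mean=0.0):
    """Survival function P(X > x) for a unit-variance normal distribution."""
    return 0.5 * math.erfc((x - mean) / math.sqrt(2.0))

def bayes_risk(eta, mu1=1.0, pi0=0.7, C=((0.0, 1.0), (1.0, 0.0))):
    """Bayes risk r(δ), Eq. (18), of the rule 'choose H1 iff x > eta'."""
    pi1 = 1.0 - pi0
    P0_R = gaussian_sf(eta, 0.0)
    P1_R = gaussian_sf(eta, mu1)
    R0 = C[0][0] * (1.0 - P0_R) + C[1][0] * P0_R
    R1 = C[0][1] * (1.0 - P1_R) + C[1][1] * P1_R
    return pi0 * R0 + pi1 * R1

# Numerical minimum of the Bayes risk over a grid of thresholds.
etas = [i * 0.001 for i in range(-2000, 4001)]
eta_best = min(etas, key=bayes_risk)

# Analytic Bayes rule: likelihood-ratio test with threshold
# tau = pi0 (C10 - C00) / (pi1 (C01 - C11)).  For N(0,1) vs N(mu1,1)
# the log-likelihood ratio is mu1*x - mu1**2/2, so the test becomes
# x > ln(tau)/mu1 + mu1/2.
pi0, mu1 = 0.7, 1.0
tau = pi0 * (1.0 - 0.0) / ((1.0 - pi0) * (1.0 - 0.0))
eta_star = math.log(tau) / mu1 + mu1 / 2.0

print(round(eta_best, 3), round(eta_star, 3))  # → 1.347 1.347
```

The grid search and the likelihood-ratio threshold agree, illustrating that among all threshold rules the Bayes rule is the one whose threshold matches the likelihood-ratio criterion.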