
4.1 Bayesian estimation

We assign a cost function $C(\theta',\theta)$ of estimating the true value of $\theta$ as $\theta'$. We then associate with an estimator $\hat{\theta}$ a conditional risk, or cost averaged over all realizations of data $x$, for each value of the parameter $\theta$:
\[
R_\theta(\hat{\theta}) = \mathrm{E}_\theta\big[C(\hat{\theta},\theta)\big] = \int_X C\big(\hat{\theta}(x),\theta\big)\, p(x,\theta)\, dx, \tag{25}
\]
where $X$ is the set of observations and $p(x,\theta)$ is the probability density of the data $x$ given the value $\theta$ of the parameter (so that $p(x,\theta)\,\pi(\theta)$ below is the joint density of $x$ and $\theta$). We further assume that there is a certain a priori probability distribution $\pi(\theta)$ of the parameter $\theta$. We then define the Bayes estimator as the estimator that minimizes the average risk defined as
\[
r(\hat{\theta}) = \mathrm{E}\big[R_\theta(\hat{\theta})\big] = \int_X \int_\Theta C\big(\hat{\theta}(x),\theta\big)\, p(x,\theta)\, \pi(\theta)\, d\theta\, dx, \tag{26}
\]
where $\mathrm{E}$ is the expectation value with respect to the a priori distribution $\pi$, and $\Theta$ is the set of values of the parameter $\theta$. It is not difficult to show that for the commonly used quadratic cost function
\[
C(\theta',\theta) = (\theta' - \theta)^2, \tag{27}
\]
the Bayes estimator is the conditional mean of the parameter $\theta$ given the data $x$, i.e.,
\[
\hat{\theta}(x) = \mathrm{E}[\theta\,|\,x] = \int_\Theta \theta\, p(\theta\,|\,x)\, d\theta, \tag{28}
\]
where $p(\theta\,|\,x)$ is the conditional probability density of the parameter $\theta$ given the data $x$.
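To fill in the step behind Equation (28), here is a short sketch. Writing $p(x,\theta)\,\pi(\theta) = p(\theta\,|\,x)\,p(x)$, where $p(x)$ is the marginal density of the data, the average risk (26) for the cost function (27) becomes

\[
r(\hat{\theta}) = \int_X \bigg[ \int_\Theta \big(\hat{\theta}(x) - \theta\big)^2\, p(\theta\,|\,x)\, d\theta \bigg]\, p(x)\, dx,
\]

which is minimized by minimizing the inner integral separately for each $x$; setting its derivative with respect to $\hat{\theta}(x)$ to zero gives

\[
2 \int_\Theta \big(\hat{\theta}(x) - \theta\big)\, p(\theta\,|\,x)\, d\theta = 0
\quad\Longrightarrow\quad
\hat{\theta}(x) = \int_\Theta \theta\, p(\theta\,|\,x)\, d\theta.
\]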
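As a concrete numerical illustration (a sketch not taken from the original text), the following Python snippet evaluates the posterior mean (28) on a grid for a hypothetical toy model: Gaussian data with known variance and a Gaussian prior on the mean. The model, grid, and all parameter values are assumptions chosen for illustration; for this conjugate pair the posterior mean also has a closed form, which serves as a check.

```python
import numpy as np

# Hypothetical toy model (illustrative assumption, not from the text):
# data x_i ~ N(theta, sigma^2) with sigma known, prior theta ~ N(mu0, tau0^2).
rng = np.random.default_rng(0)
sigma, mu0, tau0 = 1.0, 0.0, 2.0
theta_true = 1.5
x = rng.normal(theta_true, sigma, size=20)

# Unnormalized posterior p(theta | x) ~ p(x | theta) * pi(theta) on a grid;
# log densities are used for numerical stability.
theta = np.linspace(-5.0, 5.0, 2001)
dtheta = theta[1] - theta[0]
log_like = -0.5 * ((x[:, None] - theta[None, :]) ** 2).sum(axis=0) / sigma**2
log_prior = -0.5 * (theta - mu0) ** 2 / tau0**2
log_post = log_like + log_prior
post = np.exp(log_post - log_post.max())
post /= post.sum() * dtheta  # normalize so the grid density integrates to 1

# Bayes estimator under quadratic cost: the posterior mean, Eq. (28).
theta_hat = (theta * post).sum() * dtheta

# Closed-form posterior mean for the Gaussian-Gaussian conjugate pair.
n = len(x)
theta_exact = (mu0 / tau0**2 + x.sum() / sigma**2) / (1 / tau0**2 + n / sigma**2)

print(f"grid posterior mean:   {theta_hat:.4f}")
print(f"closed-form posterior: {theta_exact:.4f}")
```

The two printed values agree to grid precision, illustrating that the conditional mean is the minimizer of the quadratic-cost average risk.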