Applied Statistics and Probability for Engineers
Section 7-4: Methods of Point Estimation
Suppose that we have some additional information about $\theta$ and that we can summarize that information in the form of a probability distribution for $\theta$, say, $f(\theta)$. This probability distribution is often called the prior distribution for $\theta$, and suppose that the mean of the prior is $\mu_0$ and the variance is $\sigma_0^2$. This is a very novel concept insofar as the rest of this book is concerned because we are now viewing the parameter $\theta$ as a random variable. The probabilities associated with the prior distribution are often called subjective probabilities because they usually reflect the analyst's degree of belief regarding the true value of $\theta$. The Bayesian approach to estimation uses the prior distribution for $\theta$, $f(\theta)$, and the joint probability distribution of the sample, say, $f(x_1, x_2, \ldots, x_n \mid \theta)$, to find a posterior distribution for $\theta$, say, $f(\theta \mid x_1, x_2, \ldots, x_n)$.
This posterior distribution contains information from both the sample and the prior distribution for $\theta$. In a sense, it expresses our degree of belief regarding the true value of $\theta$ after observing the sample data. It is conceptually easy to find the posterior distribution. The joint probability distribution of the sample $X_1, X_2, \ldots, X_n$ and the parameter $\theta$ (remember that $\theta$ is a random variable) is

$$f(x_1, x_2, \ldots, x_n, \theta) = f(x_1, x_2, \ldots, x_n \mid \theta)\, f(\theta)$$
and the marginal distribution of $X_1, X_2, \ldots, X_n$ is

$$f(x_1, x_2, \ldots, x_n) =
\begin{cases}
\displaystyle\sum_{\theta} f(x_1, x_2, \ldots, x_n, \theta), & \theta \text{ discrete} \\[1.5ex]
\displaystyle\int_{-\infty}^{\infty} f(x_1, x_2, \ldots, x_n, \theta)\, d\theta, & \theta \text{ continuous}
\end{cases}$$
Therefore, the desired distribution is
$$f(\theta \mid x_1, x_2, \ldots, x_n) = \frac{f(x_1, x_2, \ldots, x_n, \theta)}{f(x_1, x_2, \ldots, x_n)}$$
We define the Bayes estimator of $\theta$ as the value $\tilde{\theta}$ that corresponds to the mean of the posterior distribution $f(\theta \mid x_1, x_2, \ldots, x_n)$.
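The recipe above (joint = likelihood × prior, marginal by summation, posterior by division, Bayes estimator as the posterior mean) can be carried out directly when $\theta$ takes only a few discrete values. The following sketch uses an illustrative Bernoulli sample and an assumed three-point prior; none of these numbers come from the text.

```python
import numpy as np

# Illustrative sketch: X_i ~ Bernoulli(theta) with a discrete prior on theta.
# The posterior follows the formulas above:
#   f(theta | x) = f(x, theta) / f(x),   f(x) = sum over theta of f(x | theta) f(theta).

theta = np.array([0.2, 0.5, 0.8])       # candidate values of theta (assumed)
prior = np.array([0.25, 0.50, 0.25])    # prior distribution f(theta) (assumed)

x = np.array([1, 0, 1, 1, 1])           # observed Bernoulli sample (assumed data)

# Likelihood f(x | theta) evaluated at each candidate theta
lik = np.array([(t**x * (1 - t)**(1 - x)).prod() for t in theta])

joint = lik * prior                     # f(x, theta)
marginal = joint.sum()                  # f(x): the sum in the marginal formula
posterior = joint / marginal            # f(theta | x)

bayes_estimate = (theta * posterior).sum()  # posterior mean = Bayes estimator
print(posterior, bayes_estimate)
```

With four successes in five trials, the posterior mass shifts toward the larger candidate value of $\theta$, and the Bayes estimator lands between the prior mean (0.5) and the sample proportion (0.8).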
Sometimes the mean of the posterior distribution of $\theta$ can be determined easily. As a function of $\theta$, $f(\theta \mid x_1, \ldots, x_n)$ is a probability density function and $x_1, \ldots, x_n$ are just constants. Because $\theta$ enters into $f(\theta \mid x_1, \ldots, x_n)$ only through $f(x_1, \ldots, x_n, \theta)$, if $f(x_1, \ldots, x_n, \theta)$ as a function of $\theta$ is recognized as a well-known probability function, the posterior mean of $\theta$ can be deduced from the well-known distribution without integration or even calculation of $f(x_1, \ldots, x_n)$.
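A standard instance of this shortcut (not the example the text develops next) is a Beta prior on a Bernoulli success probability: the joint density, viewed as a function of $\theta$, is a Beta kernel, so the posterior mean is read off without computing the marginal. The sketch below uses assumed prior parameters and data, and checks the shortcut numerically against the general posterior formula.

```python
# Sketch with assumed numbers: with a Beta(a, b) prior on a Bernoulli success
# probability theta and s successes in n trials, f(x_1,...,x_n, theta) is
# proportional (in theta) to theta^(a+s-1) * (1-theta)^(b+n-s-1).  We recognize
# this as a Beta(a+s, b+n-s) kernel, so the posterior mean is (a+s)/(a+b+n)
# with no integration needed.

a, b = 2.0, 2.0                # assumed prior parameters
data = [1, 1, 0, 1, 0, 1, 1]  # assumed Bernoulli observations
n, s = len(data), sum(data)

posterior_mean = (a + s) / (a + b + n)  # read off from the recognized Beta form

# Check against the general formula f(theta|x) = f(x, theta) / f(x),
# approximating the integral for f(x) by a sum over a fine grid
grid = [i / 10000 for i in range(1, 10000)]
joint = [t**s * (1 - t)**(n - s) * t**(a - 1) * (1 - t)**(b - 1) for t in grid]
marginal = sum(joint)
numeric_mean = sum(t * j for t, j in zip(grid, joint)) / marginal
print(posterior_mean, numeric_mean)
```

The constant grid spacing cancels in the ratio, so the crude Riemann sum suffices for the check; the two means agree to several decimal places.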
Example 7-17 | Bayes Estimator for the Mean of a Normal Distribution

Let $X_1, X_2, \ldots, X_n$ be a random sample from the normal distribution with mean $\mu$ and variance $\sigma^2$, where $\mu$ is unknown and $\sigma^2$ is known. Assume that the prior distribution for $\mu$ is normal with mean $\mu_0$ and variance $\sigma_0^2$; that is,

$$f(\mu) = \frac{1}{\sqrt{2\pi}\,\sigma_0} e^{-(\mu - \mu_0)^2 / (2\sigma_0^2)} = \frac{1}{\sqrt{2\pi}\,\sigma_0} e^{-(\mu^2 - 2\mu_0\mu + \mu_0^2)/(2\sigma_0^2)}$$
The joint probability distribution of the sample is
$$f(x_1, x_2, \ldots, x_n \mid \mu) = \frac{1}{(2\pi\sigma^2)^{n/2}}\, e^{-(1/2\sigma^2) \sum_{i=1}^{n} (x_i - \mu)^2}$$

$$= \frac{1}{(2\pi\sigma^2)^{n/2}}\, e^{-(1/2\sigma^2)\left(\sum x_i^2 - 2\mu \sum x_i + n\mu^2\right)}$$
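Completing this derivation (it continues past this excerpt) yields the standard normal-with-normal-prior conjugate result: the posterior mean is the precision-weighted average $\big((\sigma^2/n)\mu_0 + \sigma_0^2 \bar{x}\big) / \big(\sigma^2/n + \sigma_0^2\big)$. The sketch below, with assumed parameter values and simulated data, applies that formula and verifies it against direct numerical computation of the posterior mean.

```python
import numpy as np

# Sketch with assumed values: normal prior N(mu0, sigma0^2) on mu, normal data
# with known sigma.  The conjugate posterior mean is a precision-weighted
# average of the prior mean mu0 and the sample mean xbar.

rng = np.random.default_rng(1)
mu0, sigma0 = 0.0, 2.0               # prior mean and std. deviation (assumed)
sigma = 1.0                          # known data std. deviation (assumed)
x = rng.normal(1.5, sigma, size=20)  # simulated sample
n, xbar = x.size, x.mean()

bayes_mean = ((sigma**2 / n) * mu0 + sigma0**2 * xbar) / (sigma**2 / n + sigma0**2)

# Verify against direct numerical integration of f(mu | x_1, ..., x_n)
mu_grid = np.linspace(-10, 10, 200001)
log_post = (-(mu_grid - mu0)**2 / (2 * sigma0**2)
            - ((x[:, None] - mu_grid)**2).sum(axis=0) / (2 * sigma**2))
w = np.exp(log_post - log_post.max())   # unnormalized posterior on the grid
numeric_mean = (mu_grid * w).sum() / w.sum()
print(bayes_mean, numeric_mean)
```

Because the weights are positive and sum to one, the Bayes estimator always lies between the prior mean $\mu_0$ and the sample mean $\bar{x}$; with $n = 20$ and a diffuse prior ($\sigma_0 = 2$), it sits close to $\bar{x}$.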