element ($X_{(1)} = \min\{X_1, X_2, \ldots, X_n\}$) and $X_{(n)}$ is the largest sample element ($X_{(n)} = \max\{X_1, X_2, \ldots, X_n\}$). $X_{(i)}$ is called the $i$th order statistic. Often the distribution of some of the order statistics is of interest, particularly the minimum and maximum sample values, $X_{(1)}$ and $X_{(n)}$, respectively.
(a) Prove that the cumulative distribution functions of these two order statistics, denoted respectively by $F_{X_{(1)}}(t)$ and $F_{X_{(n)}}(t)$, are

$$F_{X_{(1)}}(t) = 1 - [1 - F(t)]^n$$
$$F_{X_{(n)}}(t) = [F(t)]^n$$

(b) Prove that if $X$ is continuous with probability density function $f(x)$, the probability distributions of $X_{(1)}$ and $X_{(n)}$ are

$$f_{X_{(1)}}(t) = n[1 - F(t)]^{n-1} f(t)$$
$$f_{X_{(n)}}(t) = n[F(t)]^{n-1} f(t)$$

(c) Let $X_1, X_2, \ldots, X_n$ be a random sample of a Bernoulli random variable with parameter $p$. Show that

$$P(X_{(n)} = 1) = 1 - (1 - p)^n$$
$$P(X_{(1)} = 0) = 1 - p^n$$

(d) Let $X_1, X_2, \ldots, X_n$ be a random sample of a normal random variable with mean $\mu$ and variance $\sigma^2$. Derive the probability density functions of $X_{(1)}$ and $X_{(n)}$.
(e) Let $X_1, X_2, \ldots, X_n$ be a random sample of an exponential random variable of parameter $\lambda$. Derive the cumulative distribution functions and probability density functions for $X_{(1)}$ and $X_{(n)}$.
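The formulas in parts (a) and (b) can be checked numerically before attempting the proofs. The following sketch (an illustration, not part of the original exercise set; the values of $\lambda$, $n$, and $t$ are arbitrary choices) draws repeated exponential samples and compares the empirical distribution functions of $X_{(1)}$ and $X_{(n)}$ with the part (a) formulas, using the exponential case of part (e), where $F(t) = 1 - e^{-\lambda t}$:

```python
# Monte Carlo check of the order-statistic CDFs from Exercise 7-80(a),
# using the exponential case of part (e): F(t) = 1 - exp(-lam * t).
import random
import math

lam, n, reps = 2.0, 5, 100_000
t = 0.3  # evaluation point (arbitrary choice for this check)

mins_below = maxs_below = 0
for _ in range(reps):
    sample = [random.expovariate(lam) for _ in range(n)]
    mins_below += min(sample) <= t
    maxs_below += max(sample) <= t

F_t = 1 - math.exp(-lam * t)                          # F(t) for Exp(lam)
print("empirical F_X(1)(t):", mins_below / reps)
print("theoretical        :", 1 - (1 - F_t) ** n)     # part (a), minimum
print("empirical F_X(n)(t):", maxs_below / reps)
print("theoretical        :", F_t ** n)               # part (a), maximum
```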
7-81. Let $X_1, X_2, \ldots, X_n$ be a random sample of a continuous random variable with cumulative distribution function $F(x)$. Find

$$E\left[F\left(X_{(n)}\right)\right]$$

and

$$E\left[F\left(X_{(1)}\right)\right]$$
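A useful fact for this exercise is the probability integral transform: when $F$ is continuous, $U_i = F(X_i)$ is uniform on $(0, 1)$, so $F(X_{(n)})$ and $F(X_{(1)})$ are the maximum and minimum of $n$ uniforms. A quick simulation (illustrative only; $n$ and the replication count are arbitrary) suggests what the two expectations should be, as a check against your derivation:

```python
# Empirical values of E[F(X_(n))] and E[F(X_(1))] for Exercise 7-81.
# By the probability integral transform, F(X_i) ~ Uniform(0,1) for any
# continuous F, so sampling uniforms directly suffices for this check.
import random

n, reps = 4, 200_000
ef_max = ef_min = 0.0
for _ in range(reps):
    u = [random.random() for _ in range(n)]  # u_i plays the role of F(x_i)
    ef_max += max(u)
    ef_min += min(u)

print("E[F(X_(n))] ~", ef_max / reps)  # compare with your derived expression
print("E[F(X_(1))] ~", ef_min / reps)
```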
7-82. Let $X$ be a random variable with mean $\mu$ and variance $\sigma^2$, and let $X_1, X_2, \ldots, X_n$ be a random sample of size $n$ from $X$. Show that the statistic

$$V = k \sum_{i=1}^{n-1} \left(X_{i+1} - X_i\right)^2$$

is an unbiased estimator for $\sigma^2$ for an appropriate choice for the constant $k$. Find this value for $k$.
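A simulation can confirm a candidate constant before (or after) working the algebra. The sketch below uses $k = 1/(2(n-1))$, a natural guess since each squared successive difference of independent observations has expectation $2\sigma^2$; treat it as a check on your own derivation rather than as the stated answer (the distribution and parameter values are arbitrary):

```python
# Monte Carlo check for Exercise 7-82: is V = k * sum (X_{i+1} - X_i)^2
# unbiased for sigma^2?  Each difference X_{i+1} - X_i of independent
# observations has variance 2*sigma^2, which motivates k = 1/(2(n-1)).
import random

mu, sigma, n, reps = 5.0, 2.0, 10, 50_000
k = 1 / (2 * (n - 1))  # candidate constant; verify against your algebra

total = 0.0
for _ in range(reps):
    x = [random.gauss(mu, sigma) for _ in range(n)]
    total += k * sum((x[i + 1] - x[i]) ** 2 for i in range(n - 1))

print("average V :", total / reps)   # should be close to sigma^2 = 4.0
print("sigma^2   :", sigma ** 2)
```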
7-83. When the population has a normal distribution, the estimator

$$\hat{\sigma} = \text{median}\left(\left|X_1 - \bar{X}\right|, \left|X_2 - \bar{X}\right|, \ldots, \left|X_n - \bar{X}\right|\right) / 0.6745$$

is sometimes used to estimate the population standard deviation. This estimator is more robust to outliers than the usual sample standard deviation and usually does not differ much from $S$ when there are no unusual observations.
(a) Calculate $\hat{\sigma}$ and $S$ for the data 10, 12, 9, 14, 18, 15, and 16.
(b) Replace the first observation in the sample (10) with 50 and recalculate both $S$ and $\hat{\sigma}$.
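For part (a), the two estimates are quick to compute by hand or with a few lines of code. The sketch below simply mechanizes the definitions and also handles the part (b) modification; it is illustrative, not part of the exercise:

```python
# Exercise 7-83: compare the robust estimator sigma-hat with S.
import statistics

def sigma_hat(data):
    """median(|x_i - xbar|) / 0.6745, the robust estimator from 7-83."""
    xbar = statistics.fmean(data)
    return statistics.median(abs(x - xbar) for x in data) / 0.6745

for label, data in [("(a) original", [10, 12, 9, 14, 18, 15, 16]),
                    ("(b) with 50 ", [50, 12, 9, 14, 18, 15, 16])]:
    print(label, " S =", round(statistics.stdev(data), 3),
          " sigma_hat =", round(sigma_hat(data), 3))
```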
7-84. Censored Data. A common problem in industry is life testing of components and systems. In this problem, we assume that lifetime has an exponential distribution with parameter $\lambda$, so $\hat{\mu} = 1/\hat{\lambda} = \bar{X}$ is an unbiased estimate of $\mu$. When $n$ components are tested until failure and the data $X_1, X_2, \ldots, X_n$ represent actual lifetimes, we have a complete sample, and $\bar{X}$ is indeed an unbiased estimator of $\mu$. However, in many situations, the components are only left under test until $r < n$ failures have occurred. Let $Y_1$ be the time of the first failure, $Y_2$ be the time of the second failure, …, and $Y_r$ be the time of the last failure. This type of test results in censored data. There are $n - r$ units still running when the test is terminated. The total accumulated test time at termination is

$$T_r = \sum_{i=1}^{r} Y_i + (n - r) Y_r$$

(a) Show that $\hat{\mu} = T_r / r$ is an unbiased estimator for $\mu$. [Hint: You will need to use the memoryless property of the exponential distribution and the results of Exercise 7-80 for the distribution of the minimum of a sample from an exponential distribution with parameter $\lambda$.]
(b) It can be shown that $V(T_r / r) = 1/(\lambda^2 r)$. How does this compare to $V(\bar{X})$ in the uncensored experiment?
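Part (a) can be sanity-checked by simulation before attempting the proof. The sketch below (a minimal illustration with arbitrarily chosen $\lambda$, $n$, and $r$) runs many censored life tests and compares the average of $T_r / r$ with $\mu = 1/\lambda$:

```python
# Monte Carlo check for Exercise 7-84(a): E[T_r / r] should equal mu = 1/lam.
import random

lam, n, r, reps = 0.5, 10, 4, 50_000
mu = 1 / lam

total = 0.0
for _ in range(reps):
    times = sorted(random.expovariate(lam) for _ in range(n))
    y = times[:r]                       # first r failure times Y_1..Y_r
    T_r = sum(y) + (n - r) * y[-1]      # accumulated test time at termination
    total += T_r / r

print("average T_r/r :", total / reps)  # should be close to mu
print("mu            :", mu)
```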
Important Terms and Concepts
Bayes estimator
Bias in parameter estimation
Bootstrap method
Central limit theorem
Estimator versus estimate
Likelihood function
Maximum likelihood estimator
Mean squared error of an estimator
Minimum variance unbiased estimator
Moment estimator
Normal distribution as the sampling distribution of a sample mean
Normal distribution as the sampling distribution of the difference in two sample means
Parameter estimation
Point estimator
Population or distribution moments
Posterior distribution
Prior distribution
Sample moments
Sampling distribution
Standard error and estimated standard error of an estimator
Statistic
Statistical inference
Unbiased estimator