Chapter 7
Estimation Theory
7.1 INTRODUCTION
In this chapter, we present classical estimation theory. There are two basic types of estimation
problems. In the first type, we are interested in estimating the parameters of one or more r.v.'s, and in
the second type, we are interested in estimating the value of an inaccessible r.v. Y in terms of the
observation of an accessible r.v. X.
7.2 PARAMETER ESTIMATION
Let X be a r.v. with pdf f(x) and X_1, ..., X_n a set of n independent r.v.'s each with pdf f(x). The
set of r.v.'s (X_1, ..., X_n) is called a random sample (or sample vector) of size n of X. Any real-valued
function of a random sample s(X_1, ..., X_n) is called a statistic.
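To make these definitions concrete, here is a minimal sketch (an illustration, not taken from the text): it draws a random sample of size n from an assumed exponential pdf and evaluates one possible statistic, the sample mean, on it.

```python
# Illustrative sketch: a random sample (X1, ..., Xn) and a statistic s(X1, ..., Xn).
# The exponential pdf and the sample-mean statistic are assumptions made for this example.
import numpy as np

rng = np.random.default_rng(0)

n = 10
sample = rng.exponential(scale=2.0, size=n)   # a random sample of size n of X

def s(xs):
    """A statistic: any real-valued function of the random sample."""
    return xs.mean()

print("observed sample:", sample)
print("value of the statistic:", s(sample))
```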
Let X be a r.v. with pdf f(x; θ) which depends on an unknown parameter θ. Let (X_1, ..., X_n) be a
random sample of X. In this case, the joint pdf of X_1, ..., X_n is given by

$$f(\mathbf{x}; \theta) = f(x_1, \ldots, x_n; \theta) = \prod_{i=1}^{n} f(x_i; \theta) \qquad (7.1)$$

where x_1, ..., x_n are the values of the observed data taken from the random sample.
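A small sketch of Eq. (7.1), assuming an exponential marginal pdf f(x; θ) = θe^(−θx) purely for illustration: the joint pdf of the independent sample is the product of the marginal pdfs evaluated at the observed data.

```python
# Sketch of Eq. (7.1): joint pdf of an independent sample as a product of marginals.
# The exponential form of f(x; theta) is an assumption for this example.
import numpy as np

def f(x, theta):
    """Assumed marginal pdf f(x; theta) = theta * exp(-theta * x), for x >= 0."""
    return theta * np.exp(-theta * x)

def joint_pdf(xs, theta):
    """f(x_1, ..., x_n; theta) = product over i of f(x_i; theta)."""
    return np.prod([f(x, theta) for x in xs])

xs = [0.4, 1.2, 0.7]                 # observed data x_1, ..., x_n
print(joint_pdf(xs, theta=1.5))
```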
An estimator of θ is any statistic s(X_1, ..., X_n), denoted as

$$\Theta = s(X_1, \ldots, X_n) \qquad (7.2)$$
For a particular set of observations X_1 = x_1, ..., X_n = x_n, the value of the estimator s(x_1, ..., x_n) will
be called an estimate of θ and denoted by θ̂. Thus an estimator is a r.v. and an estimate is a particular
realization of it. It is not necessary that an estimate of a parameter be one single value; instead, the
estimate could be a range of values. Estimates which specify a single value are called point estimates,
and estimates which specify a range of values are called interval estimates.
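As a sketch of this distinction, the following example (illustrative; the normal data and the approximate 95% normal-theory interval are assumptions, not constructions given in the text) computes a point estimate and an interval estimate of a population mean from the same sample.

```python
# Point estimate vs. interval estimate of a population mean (illustrative sketch).
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=5.0, scale=2.0, size=50)      # observed random sample

point_estimate = x.mean()                         # a single value

# Interval estimate: mean +/- 1.96 * s / sqrt(n), an approximate 95% interval.
half_width = 1.96 * x.std(ddof=1) / np.sqrt(len(x))
interval_estimate = (point_estimate - half_width, point_estimate + half_width)

print("point estimate:   ", point_estimate)
print("interval estimate:", interval_estimate)
```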
7.3 PROPERTIES OF POINT ESTIMATORS
A. Unbiased Estimators:
An estimator Θ = s(X_1, ..., X_n) is said to be an unbiased estimator of the parameter θ if

$$E(\Theta) = \theta$$

for all possible values of θ. If Θ is an unbiased estimator, then its mean square error is given by

$$E[(\Theta - \theta)^2] = E\{[\Theta - E(\Theta)]^2\} = \operatorname{Var}(\Theta)$$

That is, its mean square error equals its variance.
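A Monte Carlo sketch of this property (an illustration under assumed normal data, not part of the text): the sample mean is an unbiased estimator of the population mean, so its simulated mean square error agrees with its simulated variance.

```python
# Unbiasedness and MSE = variance for the sample mean (Monte Carlo sketch).
import numpy as np

rng = np.random.default_rng(2)
theta = 3.0                      # true parameter (population mean), assumed for the example
n, trials = 20, 100_000

# Each row is a random sample of size n; each entry of `estimates` is one realization of Theta.
estimates = rng.normal(loc=theta, scale=1.0, size=(trials, n)).mean(axis=1)

print("E(Theta)   ~", estimates.mean())                     # close to theta (unbiased)
print("MSE        ~", np.mean((estimates - theta) ** 2))    # close to Var(Theta)
print("Var(Theta) ~", estimates.var())
```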
B. Efficient Estimators:
An estimator Θ_1 is said to be a more efficient estimator of the parameter θ than the estimator Θ_2
if
1. Θ_1 and Θ_2 are both unbiased estimators of θ.
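As a sketch of comparing two estimators in this sense (assuming the comparison is completed in the usual way by their variances, with the smaller-variance unbiased estimator being the more efficient one), the example below contrasts the sample mean and the sample median as estimators of the mean of normal data; both are unbiased, and the sample mean has the smaller variance.

```python
# Relative efficiency of two unbiased estimators (Monte Carlo sketch; normal data assumed).
import numpy as np

rng = np.random.default_rng(3)
theta, n, trials = 0.0, 25, 100_000

samples = rng.normal(loc=theta, scale=1.0, size=(trials, n))
theta1 = samples.mean(axis=1)          # Theta_1: sample mean (unbiased)
theta2 = np.median(samples, axis=1)    # Theta_2: sample median (unbiased for normal data)

print("Var(Theta_1) ~", theta1.var())  # roughly 1/n
print("Var(Theta_2) ~", theta2.var())  # roughly pi/(2n) > 1/n, so Theta_1 is more efficient
```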