With $\mu_1 = \mu_2 = 0$ and $\sigma_1 = \sigma_2 = 1$, from (3.6.2), one can write down the joint pdf of $(X, Y)$:

$$f(x, y; \rho) = \left\{2\pi\sqrt{1-\rho^2}\right\}^{-1} \exp\left\{-\frac{x^2 - 2\rho xy + y^2}{2(1-\rho^2)}\right\}, \quad (x, y) \in \mathbb{R}^2.$$
In order to derive the expression for the Fisher information $I_{X,Y}(\rho)$, one may proceed with the natural logarithm of $f(x, y; \rho)$: differentiate it with respect to $\rho$, square the derivative, and then evaluate the expectation of that expression after replacing $(x, y)$ with $(X, Y)$. This direct approach becomes quite involved and is left as an exercise. Let us, however, adopt a different approach in the following example.
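For readers who wish to attempt the exercise, here is a minimal SymPy sketch of the direct route (our own illustration, not from the text): it differentiates $\log f(x, y; \rho)$ symbolically, expands the squared score as a polynomial in $(x, y)$, and replaces each monomial by the corresponding moment of the $N_2(0, 0, 1, 1, \rho)$ distribution. The moments themselves are standard, but the hard-coded table and all variable names are our choices.

import sympy as sp

x, y, rho = sp.symbols('x y rho')

# log-density of N_2(0, 0, 1, 1, rho), dropping the rho-free constant -log(2*pi)
logf = -sp.Rational(1, 2) * sp.log(1 - rho**2) \
       - (x**2 - 2*rho*x*y + y**2) / (2 * (1 - rho**2))

score = sp.diff(logf, rho)

# Moments E[X^i Y^j] of N_2(0, 0, 1, 1, rho) appearing in the squared score
moments = {
    (0, 0): 1, (2, 0): 1, (0, 2): 1, (1, 1): rho,
    (4, 0): 3, (0, 4): 3, (2, 2): 1 + 2*rho**2,
    (3, 1): 3*rho, (1, 3): 3*rho,
}

# Expand score**2 in (x, y) and take expectations term by term
poly = sp.Poly(sp.expand(score**2), x, y)
info = sum(coeff * moments[degs] for degs, coeff in poly.terms())
print(sp.simplify(info))  # equivalent to (1 + rho**2)/(1 - rho**2)**2

Running this reproduces the value $I_{X,Y}(\rho) = (1+\rho^2)/(1-\rho^2)^2$ that is derived far more cleanly in Example 6.5.8 below.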
Example 6.5.8 (Example 6.5.7 Continued) Suppose that $(X, Y)$ is distributed as $N_2(0, 0, 1, 1, \rho)$, where the unknown parameter is the correlation coefficient $\rho \in (-1, 1)$. Define $U = X - Y$, $V = X + Y$, and notice that $(U, V)$ can be uniquely obtained from $(X, Y)$ and vice versa. In other words, $I_{U,V}(\rho)$ and $I_{X,Y}(\rho)$ should be exactly the same in view of Theorem 6.4.4. But observe that $(U, V)$ is distributed as $N_2(0, 0, 2(1-\rho), 2(1+\rho), 0)$, that is, $U$ and $V$ are independent random variables. Refer back to Section 3.7 as needed. Hence, using Theorem 6.4.3, we immediately conclude that $I_{U,V}(\rho) = I_U(\rho) + I_V(\rho)$. So, we can write:

$$I_{X,Y}(\rho) = I_{U,V}(\rho) = I_U(\rho) + I_V(\rho). \qquad (6.5.5)$$
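As a quick numerical sanity check of these distributional claims (an aside of ours, not part of the text), one may simulate from $N_2(0, 0, 1, 1, \rho)$ with NumPy and inspect the sample variances and correlation of $(U, V)$; the values $\rho = 0.6$ and $n = 200{,}000$ are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)
rho, n = 0.6, 200_000

# draw (X, Y) from N_2(0, 0, 1, 1, rho)
cov = [[1.0, rho], [rho, 1.0]]
xy = rng.multivariate_normal([0.0, 0.0], cov, size=n)
u, v = xy[:, 0] - xy[:, 1], xy[:, 0] + xy[:, 1]

print(np.var(u), 2 * (1 - rho))    # both near 0.8
print(np.var(v), 2 * (1 + rho))    # both near 3.2
print(np.corrcoef(u, v)[0, 1])     # near 0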
In view of (6.5.5), it will suffice to evaluate $I_U(\rho)$ and $I_V(\rho)$ separately. It is clear that $U$ is $N(0, 2(1-\rho))$ while $V$ is $N(0, 2(1+\rho))$. The pdf of $U$ is given by

$$f(u; \rho) = \{4\pi(1-\rho)\}^{-1/2} \exp\left\{-\frac{u^2}{4(1-\rho)}\right\}, \quad u \in \mathbb{R},$$

so that one has

$$\frac{\partial}{\partial\rho} \log f(u; \rho) = \frac{1}{2}(1-\rho)^{-1} - \frac{1}{4}u^2(1-\rho)^{-2}.$$

Hence, we write

$$I_U(\rho) = E_\rho\left[\left\{\frac{\partial}{\partial\rho} \log f(U; \rho)\right\}^2\right] = \frac{1}{4}(1-\rho)^{-2} - \frac{1}{4}(1-\rho)^{-3}E_\rho[U^2] + \frac{1}{16}(1-\rho)^{-4}E_\rho[U^4] = \frac{1}{2}(1-\rho)^{-2},$$
since $E_\rho[U^2] = 2(1-\rho)$ and $E_\rho[U^4] = 12(1-\rho)^2$. Similarly, one would verify that $I_V(\rho) = \frac{1}{2}(1+\rho)^{-2}$. Then, from (6.5.5), we can obviously write:

$$I_{X,Y}(\rho) = \frac{1}{2}(1-\rho)^{-2} + \frac{1}{2}(1+\rho)^{-2} = \frac{1+\rho^2}{(1-\rho^2)^2}.$$
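To double-check the pieces of this example, one can repeat the one-dimensional moment computation in SymPy for both components and then sum them, as in the following sketch of ours; the helper name info_component is hypothetical and not from the text.

import sympy as sp

u, rho = sp.symbols('u rho')

def info_component(sign):
    # log-density of N(0, 2(1 + sign*rho)), dropping the rho-free constant
    var = 2 * (1 + sign * rho)
    logf = -sp.Rational(1, 2) * sp.log(var) - u**2 / (2 * var)
    sq_score = sp.expand(sp.diff(logf, rho)**2)
    # replace u^0, u^2, u^4 by the N(0, var) moments 1, var, 3*var**2
    moments = {0: 1, 2: var, 4: 3 * var**2}
    poly = sp.Poly(sq_score, u)
    return sp.simplify(sum(c * moments[d[0]] for d, c in poly.terms()))

I_U, I_V = info_component(-1), info_component(+1)
print(I_U)                      # equivalent to (1/2)(1 - rho)^(-2)
print(I_V)                      # equivalent to (1/2)(1 + rho)^(-2)
print(sp.simplify(I_U + I_V))   # equivalent to (1 + rho**2)/(1 - rho**2)**2

This reproduces $I_U(\rho) = \frac{1}{2}(1-\rho)^{-2}$, $I_V(\rho) = \frac{1}{2}(1+\rho)^{-2}$, and the total information displayed above, in agreement with the direct computation sketched after (6.5.5).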