4. A two-dimensional random vector becomes $[a\ b]^T$, $[-a\ -b]^T$, $[-c\ d]^T$ or $[c\ -d]^T$ with probability 1/4 for each.
(a) Compute the expected vector and covariance matrix.
(b) Find the condition for $a$, $b$, $c$, and $d$ to satisfy in order to obtain $\rho = 0$.
(c) Find the conditions for $a$, $b$, $c$, and $d$ to satisfy in order to obtain $\rho = +1$ and $\rho = -1$.
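A quick numerical check for part (a) can be sketched with NumPy; the values assigned to a, b, c, and d below are arbitrary illustrative choices, not part of the problem.

```python
import numpy as np

# Hypothetical values for a, b, c, d -- chosen only to illustrate the computation.
a, b, c, d = 1.0, 2.0, 0.5, 1.5

# The four equally likely outcomes of the random vector.
outcomes = np.array([[a, b], [-a, -b], [-c, d], [c, -d]])
p = np.full(4, 0.25)

# Expected vector: probability-weighted sum of the outcomes.
mean = p @ outcomes

# Covariance matrix: E[(X - M)(X - M)^T].
centered = outcomes - mean
cov = (p[:, None] * centered).T @ centered

# Correlation coefficient rho from the off-diagonal element.
rho = cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1])
print(mean, cov, rho, sep="\n")
```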
5. Let $\hat{m}$ be the sample mean of $N$ samples, $x_1, \ldots, x_N$, drawn from $N_x(m, \sigma^2)$. Find the expected value and variance of $(\hat{m} - m)^2$, and confirm that $\mathrm{Var}\{(\hat{m} - m)^2\} \sim 1/N^2$.
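The $1/N^2$ behaviour can be checked with a small Monte Carlo sketch such as the one below; the population parameters and trial count are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
m, sigma = 0.0, 1.0          # illustrative population mean and standard deviation
trials = 20000               # number of repeated experiments per N

for N in (10, 100, 1000):
    # Draw `trials` independent sets of N samples and form the sample means.
    samples = rng.normal(m, sigma, size=(trials, N))
    m_hat = samples.mean(axis=1)
    # The variance of (m_hat - m)^2 should shrink roughly like 1/N^2,
    # so v * N^2 should stay roughly constant as N grows.
    v = np.var((m_hat - m) ** 2)
    print(N, v, v * N**2)
```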
6. Let
$$C_1 = \begin{bmatrix} 1 & 0.5 \\ 0.5 & 1 \end{bmatrix} \quad \text{and} \quad C_2 = \begin{bmatrix} 1 & -0.5 \\ -0.5 & 1 \end{bmatrix}.$$
Diagonalize these two matrices simultaneously.
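One way to verify a simultaneous diagonalization numerically is the generalized symmetric eigenproblem, sketched below with SciPy; the matrix entries are taken from the problem statement as reconstructed above.

```python
import numpy as np
from scipy.linalg import eigh

C1 = np.array([[1.0, 0.5], [0.5, 1.0]])
C2 = np.array([[1.0, -0.5], [-0.5, 1.0]])

# Generalized symmetric eigenproblem C1 v = lam * C2 v.
# The returned A whitens C2 (A^T C2 A = I) and diagonalizes C1 (A^T C1 A = diag(lam)).
lam, A = eigh(C1, C2)

print(np.round(A.T @ C2 @ A, 10))   # identity matrix
print(np.round(A.T @ C1 @ A, 10))   # diag(lam)
```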
7. Prove that $S^{-1}M$ and $\Sigma^{-1}M$ are the same vector with different lengths.
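A numerical sanity check of this claim, assuming $S$ denotes the autocorrelation matrix $S = \Sigma + MM^T$ used in this chapter; the particular $\Sigma$ and $M$ below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
# Illustrative symmetric positive definite covariance Sigma and mean vector M.
B = rng.normal(size=(3, 3))
Sigma = B @ B.T + 3 * np.eye(3)
M = rng.normal(size=3)

S = Sigma + np.outer(M, M)           # autocorrelation matrix

u = np.linalg.solve(S, M)            # S^{-1} M
v = np.linalg.solve(Sigma, M)        # Sigma^{-1} M

# The two vectors should be parallel: the cosine of their angle is 1,
# while the ratio of their lengths differs from 1.
cos = u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
print(cos, np.linalg.norm(v) / np.linalg.norm(u))
```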
8. Express a non-zero eigenvalue and the corresponding eigenvector of $\Sigma^{-1}MM^T$ in terms of $\Sigma$ and $M$. (Hint: The rank of $\Sigma^{-1}MM^T$ is one.)
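The rank-one structure can be inspected numerically before attempting the derivation; $\Sigma$ and $M$ below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.normal(size=(3, 3))
Sigma = B @ B.T + 3 * np.eye(3)      # illustrative covariance matrix
M = rng.normal(size=3)               # illustrative mean vector

A = np.linalg.solve(Sigma, np.outer(M, M))   # Sigma^{-1} M M^T, a rank-one matrix

# Only one eigenvalue is non-zero; since the matrix has rank one,
# that eigenvalue also equals the trace, which can be compared with
# whatever closed form is derived for the problem.
print(np.sort(np.linalg.eigvals(A).real))
print(np.trace(A))
```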
9. Let $S$ be an $n \times n$ matrix, composed of two vectors $M_1$ and $M_2$ as $S = M_1M_1^T + M_2M_2^T$. The lengths of $M_1$ and $M_2$ are 1 and 2 respectively, and their mutual angle is $60^\circ$. Compute the eigenvalues of $S$.
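Since the eigenvalues of $S$ depend only on the two lengths and the angle between the vectors, any concrete pair with those properties can be used for a numerical check, for example:

```python
import numpy as np

# Any vectors with lengths 1 and 2 and a 60-degree mutual angle will do;
# the non-zero eigenvalues of S are determined by those quantities alone.
theta = np.deg2rad(60)
M1 = np.array([1.0, 0.0, 0.0])
M2 = 2.0 * np.array([np.cos(theta), np.sin(theta), 0.0])

S = np.outer(M1, M1) + np.outer(M2, M2)
print(np.round(np.linalg.eigvalsh(S), 6))   # two non-zero eigenvalues, the rest zero
```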
10. After the mixture of two distributions is normalized by a shift and a linear transformation, the expected vectors and covariance matrices satisfy the following equations. Calculate the following in terms of $P_1$, $P_2$, and $M_1$
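The constraint equations referenced above are not reproduced in this excerpt; the sketch below assumes the usual normalization in which the mixture mean becomes zero and the mixture second moment (autocorrelation) becomes the identity, with all numerical values chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative two-component mixture; P1, M1, Sigma1, etc. are arbitrary example values.
P1, P2 = 0.3, 0.7
M1, M2 = np.array([1.0, -2.0]), np.array([0.5, 1.0])
A1, A2 = rng.normal(size=(2, 2)), rng.normal(size=(2, 2))
S1, S2 = A1 @ A1.T + np.eye(2), A2 @ A2.T + np.eye(2)

# Mixture mean and mixture second moment about that mean.
M0 = P1 * M1 + P2 * M2
R = (P1 * (S1 + np.outer(M1 - M0, M1 - M0))
     + P2 * (S2 + np.outer(M2 - M0, M2 - M0)))

# Shift and linear (whitening) transformation: Y = T (X - M0) with T = R^{-1/2}.
w, V = np.linalg.eigh(R)
T = V @ np.diag(w ** -0.5) @ V.T

m1, m2 = T @ (M1 - M0), T @ (M2 - M0)
s1, s2 = T @ S1 @ T.T, T @ S2 @ T.T

print(P1 * m1 + P2 * m2)                                             # ~ zero vector
print(P1 * (s1 + np.outer(m1, m1)) + P2 * (s2 + np.outer(m2, m2)))   # ~ identity
```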