     Fortunately, in pattern recognition problems, $|\Sigma|$ is rarely computed
directly. Instead, $\ln|\Sigma|$ is commonly used, which can be computed from the
eigenvalues as

          \ln|\Sigma| = \sum_{i=1}^{n} \ln\lambda_i .                          (2.152)


For the above example, $\ln|\Sigma| = \sum_{i=1}^{10}\ln\lambda_i + 90\ln(0.1/90) = \sum_{i=1}^{10}\ln\lambda_i - 612.2$.
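
A minimal NumPy sketch of this point, assuming a synthetic covariance with 10 dominant eigenvalues and 90 tiny ones of $0.1/90$ each (the sizes and values are assumptions for illustration, loosely modeled on the example above): the determinant itself is vanishingly small, while $\ln|\Sigma|$ computed from the eigenvalues as in (2.152) remains an ordinary number.

    import numpy as np

    # Assumed synthetic covariance: random orthonormal basis Q with 10 large
    # eigenvalues and 90 eigenvalues of 0.1/90 each.
    rng = np.random.default_rng(0)
    Q, _ = np.linalg.qr(rng.standard_normal((100, 100)))
    lam = np.concatenate([np.linspace(10.0, 1.0, 10), np.full(90, 0.1 / 90)])
    Sigma = Q @ np.diag(lam) @ Q.T

    # The determinant itself is extremely small (on the order of 1e-260) ...
    print(np.linalg.det(Sigma))

    # ... but ln|Sigma| from the eigenvalues, as in (2.152), is well behaved.
    eigvals = np.linalg.eigvalsh(Sigma)          # eigenvalues of the symmetric Sigma
    print(np.sum(np.log(eigvals)))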
     As far as the inverse is concerned, each element of $\Sigma^{-1}$ is given by the ratio
of a cofactor (the determinant of an $(n-1)\times(n-1)$ matrix $B$) and $|\Sigma|$. The cofactor
is the product of the $(n-1)$ eigenvalues of $B$, while $|\Sigma|$ is the product of the $n$
eigenvalues of $\Sigma$. Assuming that the $(n-1)$ eigenvalues of the denominator are, roughly
speaking, cancelled out by the $(n-1)$ eigenvalues of the numerator, $|B|/|\Sigma|$ is
proportional to $1/\lambda_k$, where $\lambda_k$ is one of the eigenvalues of $\Sigma$. Therefore,
although $|\Sigma|$ becomes extremely small, as the above example indicates, each element of
$\Sigma^{-1}$ does not go up to an extremely large number. In order to avoid $|B|/|\Sigma| = 0/0$
in computation, it is suggested to use the following formula to compute the inverse matrix.

          \Sigma^{-1} = \Phi \Lambda^{-1} \Phi' = \sum_{i=1}^{n} \frac{1}{\lambda_i}\,\phi_i \phi_i'                    (2.153)


Again, the eigenvalues and eigenvectors of $\Sigma$ are computed first, and then $\Sigma^{-1}$ is
obtained by (2.153). Recall from (2.129) that, if $\Lambda$ and $\Phi$ are the eigenvalue and
eigenvector matrices of $\Sigma$, then $\Lambda^{-1}$ and $\Phi$ are the eigenvalue and eigenvector
matrices of $\Sigma^{-1}$. Also, any matrix $Q$ can be expressed by (2.138), using its
eigenvalues and eigenvectors.
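
A minimal sketch of (2.153) in NumPy (the function and variable names are assumptions for illustration): $\Sigma^{-1}$ is assembled from the eigenvectors and the reciprocal eigenvalues, rather than from cofactors and the near-zero determinant.

    import numpy as np

    def inverse_from_eigensystem(Sigma):
        """Sigma^{-1} = Phi Lambda^{-1} Phi', as in (2.153)."""
        lam, Phi = np.linalg.eigh(Sigma)      # eigenvalues lam, eigenvectors Phi
        return (Phi / lam) @ Phi.T            # sum of (1/lambda_i) phi_i phi_i'

    # Quick check against the direct inverse on a well-conditioned example
    rng = np.random.default_rng(1)
    A = rng.standard_normal((5, 5))
    Sigma = A @ A.T + 5 * np.eye(5)           # symmetric positive definite
    print(np.allclose(inverse_from_eigensystem(Sigma), np.linalg.inv(Sigma)))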

                     Matrix Inversion

     Diagonalization of matrices is particularly useful when we need the inverse
of a matrix.
     From (2.66), a distance function is expressed by

          d^2(X) = (X - M)' \Sigma^{-1} (X - M) = (Y - D)' \Lambda^{-1} (Y - D)                    (2.154)

where $D = [d_1 \ldots d_n]'$ and $\Lambda$ are the expected vector and diagonal covariance
matrix of $Y$, respectively.
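
The same identity can be checked numerically; the sketch below assumes the orthonormal transformation $Y = \Phi'X$ and $D = \Phi'M$ from the earlier discussion (the variable names are illustrative):

    import numpy as np

    rng = np.random.default_rng(2)
    n = 4
    A = rng.standard_normal((n, n))
    Sigma = A @ A.T + np.eye(n)               # positive definite covariance
    M = rng.standard_normal(n)                # expected vector
    X = rng.standard_normal(n)                # a sample point

    # Left side of (2.154): quadratic form with Sigma^{-1}
    d2_direct = (X - M) @ np.linalg.inv(Sigma) @ (X - M)

    # Right side: diagonalize Sigma and sum the individually scaled squares
    lam, Phi = np.linalg.eigh(Sigma)          # Lambda (eigenvalues) and Phi
    Y, D = Phi.T @ X, Phi.T @ M               # assumed transformation Y = Phi'X
    d2_diag = np.sum((Y - D) ** 2 / lam)

    print(np.isclose(d2_direct, d2_diag))     # True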