
Matrix Algebra

These are complex numbers, containing both real parts and imaginary parts which include the imaginary number $i = \sqrt{-1}$. Fortunately, a symmetric matrix always yields real eigenvalues, and most of our computations involving eigenvalues and eigenvectors will utilize covariance, correlation, or similarity matrices, which are always symmetric.
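To make the distinction concrete, here is a minimal sketch (assuming NumPy is available; both matrices are hypothetical examples, not drawn from the text) contrasting a non-symmetric matrix, whose eigenvalues form a complex conjugate pair, with a symmetric, correlation-like matrix, whose eigenvalues are guaranteed to be real:

```python
import numpy as np

# Hypothetical non-symmetric matrix: its eigenvalues are the complex pair +/- i.
a = np.array([[0.0, -1.0],
              [1.0,  0.0]])
print(np.linalg.eigvals(a))      # [0.+1.j  0.-1.j]

# Symmetric, correlation-like matrix: its eigenvalues are real.
r = np.array([[1.0, 0.6],
              [0.6, 1.0]])
print(np.linalg.eigvalsh(r))     # [0.4  1.6]
```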
                 Next, we will consider the eigenvalues of  the third-order matrix:

$$\begin{bmatrix} 20 & -4 & 8 \\ -40 & 8 & -20 \\ -60 & 12 & -26 \end{bmatrix}$$

Subtracting $\lambda$ from each diagonal entry and setting the determinant of the result equal to zero gives the characteristic equation:

$$\begin{vmatrix} 20-\lambda & -4 & 8 \\ -40 & 8-\lambda & -20 \\ -60 & 12 & -26-\lambda \end{vmatrix} = 0$$

             Expanding out the determinant and combining terms yields

$$-\lambda^3 + 2\lambda^2 + 8\lambda = 0$$

This is a cubic equation having three roots that must be found. In this instance, the polynomial can be factored (after multiplying through by $-1$) into

$$(\lambda - 4)(\lambda - 0)(\lambda + 2) = 0$$

             and the roots are directly obtainable:

$$\lambda_1 = +4 \qquad \lambda_2 = 0 \qquad \lambda_3 = -2$$
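These hand-computed eigenvalues are easy to verify numerically. The sketch below, assuming NumPy is available, checks both the roots of the characteristic polynomial and the eigenvalues extracted directly from the matrix; it is purely illustrative, not the method of the text:

```python
import numpy as np

a = np.array([[ 20.0,  -4.0,   8.0],
              [-40.0,   8.0, -20.0],
              [-60.0,  12.0, -26.0]])

# Roots of the characteristic polynomial -lambda^3 + 2*lambda^2 + 8*lambda = 0
print(np.roots([-1.0, 2.0, 8.0, 0.0]))   # 4, -2, 0 (order may vary)

# Eigenvalues extracted directly from the matrix
print(np.linalg.eigvals(a))              # 4, 0, -2 (order may vary)
```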

                 Although the techniques we have been using are extendible to any size matrix,
             finding the roots of  large polynomial equations can be an arduous task.  Usually,
             eigenvalues are not found by solution of  a polynomial equation, but rather by ma-
             trix manipulation methods that involve refinement of a successive series of approx-
             imations to the eigenvalues. These methods are practical only because of the great
             computational speed of  digital computers.  Utilizing this speed, a researcher  can
             compress literally a lifetime of trial solutions and refinements into a few minutes.
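The simplest of these successive-approximation schemes is the power method, which refines an arbitrary trial vector by repeated multiplication until the estimate of the dominant eigenvalue stabilizes. The sketch below, assuming NumPy, illustrates the idea on the matrix above; it is one illustrative choice, as production routines use more refined algorithms such as the QR method:

```python
import numpy as np

def power_method(a, iterations=100):
    """Estimate the dominant (largest-magnitude) eigenvalue of a."""
    x = np.ones(a.shape[0])            # arbitrary starting vector
    for _ in range(iterations):
        x = a @ x                      # each multiplication refines the estimate
        x = x / np.linalg.norm(x)      # renormalize to avoid overflow
    return x @ a @ x / (x @ x)         # Rayleigh-quotient estimate of the eigenvalue

a = np.array([[ 20.0,  -4.0,   8.0],
              [-40.0,   8.0, -20.0],
              [-60.0,  12.0, -26.0]])
print(power_method(a))                 # converges to the dominant eigenvalue, 4
```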
                 We can now define another measure of  the “size” of a square matrix. The rank
             of a square matrix is the number of  independent rows (or columns) in the matrix
             and is equal to the number of  nonzero eigenvalues that can be extracted from the
             matrix.  A nonsingular matrix has as many nonzero eigenvalues as there are rows
             or columns in the matrix, so its rank is equal to its order.  A singular matrix has
             one or more rows or columns that are dependent on other rows or columns, and
              consequently will have one or more zero eigenvalues; its rank will be less than its
              order.
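The third-order matrix examined above illustrates this directly: one of its three eigenvalues is zero, so the matrix is singular and its rank (2) is less than its order (3). A minimal sketch, again assuming NumPy:

```python
import numpy as np

a = np.array([[ 20.0,  -4.0,   8.0],
              [-40.0,   8.0, -20.0],
              [-60.0,  12.0, -26.0]])

eigenvalues = np.linalg.eigvals(a)
print(np.count_nonzero(~np.isclose(eigenvalues, 0.0)))   # 2 nonzero eigenvalues
print(np.linalg.matrix_rank(a))                          # rank 2 < order 3: singular
```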
    Now that we have an idea of the manipulations that produce eigenvalues, we may try to get some insight into their nature. The rows of a matrix can be regarded as the coordinates of points in m-dimensional space. If we restrict our consideration to 2 × 2 matrices, we can represent this space as an illustration on a page and can view matrix operations geometrically.
