
5.2. MATRICES AND DETERMINANTS                     171

                           Remark. If a matrix is real (i.e., all its entries are real), then its transpose and its adjoint
                        coincide.
                           A square matrix A is said to be normal if A∗A = AA∗. A normal matrix A is said to be
                        unitary if A∗A = AA∗ = I, i.e., A∗ = A⁻¹ (see Paragraph 5.2.1-6).
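These definitions are easy to check numerically. The following Python sketch (illustrative, not part of the handbook; the function names are assumptions) tests A∗A = AA∗ and A∗A = I for small matrices stored as lists of lists.

```python
import math

def conj_transpose(A):
    """Return the adjoint (conjugate transpose) A* of matrix A."""
    return [[A[i][j].conjugate() for i in range(len(A))]
            for j in range(len(A[0]))]

def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def is_normal(A, tol=1e-12):
    """A is normal if A* A = A A* (entrywise, up to rounding)."""
    left, right = matmul(conj_transpose(A), A), matmul(A, conj_transpose(A))
    return all(abs(left[i][j] - right[i][j]) < tol
               for i in range(len(A)) for j in range(len(A)))

def is_unitary(A, tol=1e-12):
    """A is unitary if A* A = I (entrywise, up to rounding)."""
    P = matmul(conj_transpose(A), A)
    return all(abs(P[i][j] - (1 if i == j else 0)) < tol
               for i in range(len(A)) for j in range(len(A)))

# A real rotation matrix is orthogonal, hence unitary (and normal).
c, s = math.cos(0.3), math.sin(0.3)
R = [[c, -s], [s, c]]
print(is_normal(R), is_unitary(R))   # True True
```

A real diagonal matrix such as [[2, 0], [0, 1]] is normal but not unitary, since A∗A = [[4, 0], [0, 1]] ≠ I.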
                       5.2.1-4. Trace of a matrix.
                        The trace of a square matrix A ≡ [a_ij] of size n × n is the sum S of its diagonal entries,

                                                      S = Tr(A) = ∑_{i=1}^{n} a_ii.
                           If λ is a scalar and the square matrices A and B have the same size, then

                                Tr(A + B) = Tr(A) + Tr(B),     Tr(λA) = λ Tr(A),     Tr(AB) = Tr(BA).
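These identities can be verified numerically; the Python sketch below (illustrative, not from the handbook) checks Tr(A + B) = Tr(A) + Tr(B) and Tr(AB) = Tr(BA) on random 3 × 3 matrices.

```python
import random

def trace(A):
    """Sum of the diagonal entries of a square matrix."""
    return sum(A[i][i] for i in range(len(A)))

def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

n = 3
A = [[random.random() for _ in range(n)] for _ in range(n)]
B = [[random.random() for _ in range(n)] for _ in range(n)]

# Tr(A + B) = Tr(A) + Tr(B)
S = [[A[i][j] + B[i][j] for j in range(n)] for i in range(n)]
assert abs(trace(S) - (trace(A) + trace(B))) < 1e-9

# Tr(AB) = Tr(BA), even though AB != BA in general
assert abs(trace(matmul(A, B)) - trace(matmul(B, A))) < 1e-9
```

Note that Tr(AB) = Tr(BA) holds even for non-commuting A and B, since both sides equal the double sum of a_ik b_ki over i and k.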


                       5.2.1-5. Linear dependence of row vectors (column vectors).
                        A row vector (column vector) B is a linear combination of row vectors (column vectors)
                        A_1, ..., A_k if there exist scalars α_1, ..., α_k such that

                                                    B = α_1 A_1 + ··· + α_k A_k.
                           Row vectors (column vectors) A_1, ..., A_k are said to be linearly dependent if there
                        exist scalars α_1, ..., α_k (α_1^2 + ··· + α_k^2 ≠ 0) such that

                                                    α_1 A_1 + ··· + α_k A_k = O,

                        where O is the zero row vector (column vector).
                           Row vectors (column vectors) A_1, ..., A_k are said to be linearly independent if, for
                        any α_1, ..., α_k (α_1^2 + ··· + α_k^2 ≠ 0), we have

                                                    α_1 A_1 + ··· + α_k A_k ≠ O.
                           THEOREM. Row vectors (column vectors) A_1, ..., A_k are linearly dependent if and
                        only if one of them is a linear combination of the others.
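In practice, linear dependence of k row vectors can be tested by computing the rank of the matrix they form: the rows are dependent exactly when the rank is smaller than k. The Gaussian-elimination routine below is an illustrative sketch, not the handbook's method.

```python
def rank(rows, tol=1e-12):
    """Rank of a list of row vectors, computed by Gaussian elimination."""
    M = [row[:] for row in rows]   # work on a copy
    r = 0                          # number of pivots found so far
    for col in range(len(M[0])):
        # find a pivot in this column at or below row r
        pivot = next((i for i in range(r, len(M)) if abs(M[i][col]) > tol), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        # eliminate this column from all other rows
        for i in range(len(M)):
            if i != r and abs(M[i][col]) > tol:
                f = M[i][col] / M[r][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

A1, A2, A3 = [1, 2, 3], [2, 4, 6], [0, 1, 1]
# A2 = 2*A1, so the three rows are linearly dependent:
print(rank([A1, A2, A3]) < 3)   # True
print(rank([A1, A3]) == 2)      # True: A1, A3 are independent
```

This matches the theorem above: rank 2 for the three rows means one of them (here A2 = 2 A1) is a linear combination of the others.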

                       5.2.1-6. Inverse matrices.
                        Let A be a square matrix of size n × n, and let I be the unit matrix of the same size.
                          A square matrix B of size n × n is called a right inverse of A if AB = I. A square
                       matrix C of size n × n is called a left inverse of A if CA = I. If one of the matrices B
                       or C exists, then the other exists, too, and these two matrices coincide. In such a case, the
                       matrix A is said to be nondegenerate (nonsingular).

                          THEOREM. A square matrix is nondegenerate if and only if its rows (columns) are
                       linearly independent.
                           Remark. Generally, instead of the terms “left inverse matrix” and “right inverse matrix”, the term “inverse
                        matrix” is used with regard to the matrix B = A⁻¹ for a nondegenerate matrix A, since AB = BA = I.
                           UNIQUENESS THEOREM. The matrix A⁻¹ is the unique matrix satisfying the condition
                        AA⁻¹ = A⁻¹A = I for a given nondegenerate matrix A.
                          Remark. For the existence theorem, see Paragraph 5.2.2-7.
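As an illustration (not the handbook's construction), the inverse of a nondegenerate matrix can be computed by Gauss–Jordan elimination applied to the augmented matrix [A | I]; the sketch below raises an error when A is degenerate.

```python
def inverse(A, tol=1e-12):
    """Inverse of a square matrix via Gauss-Jordan elimination on [A | I]."""
    n = len(A)
    # augment A with the unit matrix I
    M = [A[i][:] + [1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for col in range(n):
        # partial pivoting: pick the largest entry in this column
        pivot = max(range(col, n), key=lambda i: abs(M[i][col]))
        if abs(M[pivot][col]) < tol:
            raise ValueError("matrix is degenerate (singular)")
        M[col], M[pivot] = M[pivot], M[col]
        M[col] = [x / M[col][col] for x in M[col]]      # scale pivot row to 1
        for i in range(n):
            if i != col:
                f = M[i][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[col])]
    # the right half of [I | A^-1] is the inverse
    return [row[n:] for row in M]

A = [[2.0, 1.0], [5.0, 3.0]]   # det = 1, so A is nondegenerate
B = inverse(A)                 # B is approximately [[3, -1], [-5, 2]]
```

One can check directly that AB = BA = I (up to rounding), in agreement with the uniqueness theorem above; for a degenerate input such as [[1, 2], [2, 4]] the routine fails, consistent with the theorem on linearly independent rows.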