(f) A(B + C) = AB + AC, (distributive law);
(g) (A + B)C = AC + BC, (distributive law);
(h) A - B = A + (-1)B;
(i) (cd)A = c(dA);
(j) c(AB) = (cA)B = A(cB);
(k) c(A + B) = cA + cB;
(l) (c + d)A = cA + dA;
(m) (A + B)^T = A^T + B^T;
(n) (AB)^T = B^T A^T.
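As an informal illustration (a sketch added here, not part of the text: it assumes the NumPy library, and the 3 × 3 matrices and the scalars c and d are arbitrary choices), several of these laws can be checked numerically on random matrices:

    import numpy as np

    rng = np.random.default_rng(0)
    # For simplicity every matrix is 3 x 3, so all sums and products exist.
    A, B, C = (rng.random((3, 3)) for _ in range(3))
    c, d = 2.0, -1.5   # arbitrary scalars

    # (f) A(B + C) = AB + AC
    print(np.allclose(A @ (B + C), A @ B + A @ C))
    # (h) A - B = A + (-1)B
    print(np.allclose(A - B, A + (-1.0) * B))
    # (j) c(AB) = (cA)B = A(cB)
    print(np.allclose(c * (A @ B), (c * A) @ B), np.allclose(c * (A @ B), A @ (c * B)))
    # (l) (c + d)A = cA + dA
    print(np.allclose((c + d) * A, c * A + d * A))
    # (m) (A + B)^T = A^T + B^T
    print(np.allclose((A + B).T, A.T + B.T))
    # (n) (AB)^T = B^T A^T  -- note that the factors are reversed
    print(np.allclose((A @ B).T, B.T @ A.T))

Each comparison prints True (up to floating-point rounding); such a check is of course no substitute for a proof.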
Each of these laws is a logical consequence of the definitions of the various matrix operations. To give formal proofs of them all is a lengthy, but routine, task; an example of such a proof will be given shortly. It must be stressed that familiarity with these laws is essential if matrices are to be manipulated correctly.
We remark that it is unambiguous to use the expression A + B + C for both (A + B) + C and A + (B + C). For by the associative law of addition these matrices are equal. The same comment applies to sums like A + B + C + D, and also to matrix products such as (AB)C and A(BC), both of which are written as ABC.
In order to illustrate the use of matrix operations, we
shall now work out three problems.
Example 1.2.6
Prove the associative law for matrix multiplication, (AB)C = A(BC), where A, B, C are m × n, n × p, p × q matrices respectively.

In the first place observe that all the products mentioned exist, and that both (AB)C and A(BC) are m × q matrices. To show that they are equal, we need to verify that their (i, j) entries are the same for all i and j.
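Before carrying out this entry-by-entry verification, the claim can be illustrated numerically (a sketch, not the proof itself: NumPy and the particular dimensions m = 2, n = 3, p = 4, q = 5 are assumptions made only for the example). Forming (AB)C and A(BC) for random matrices of the stated shapes, one finds that the two products agree up to rounding error:

    import numpy as np

    rng = np.random.default_rng(1)
    m, n, p, q = 2, 3, 4, 5          # arbitrary dimensions
    A = rng.random((m, n))
    B = rng.random((n, p))
    C = rng.random((p, q))

    left = (A @ B) @ C               # an m x q matrix
    right = A @ (B @ C)              # also an m x q matrix
    print(left.shape, right.shape)   # (2, 5) (2, 5)
    print(np.allclose(left, right))  # True, up to floating-point rounding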