3.6 PROPERTIES OF MATRIX MULTIPLICATION
We saw in the previous section that there are some differences between scalar
and matrix algebra—most notable is the fact that matrix multiplication is not
commutative, and there is no cancellation law. But there are also some important
similarities, and the purpose of this section is to look deeper into these issues.
Although we can adjust to not having the commutative property, the situa-
tion would be unbearable if the distributive and associative properties were not
available. Fortunately, both of these properties hold for matrix multiplication.
Distributive and Associative Laws
For conformable matrices, each of the following is true.
• A(B + C) = AB + AC (left-hand distributive law).
• (D + E)F = DF + EF (right-hand distributive law).
• A(BC) = (AB)C (associative law).
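Before turning to the proof, a quick numerical sanity check can be reassuring. The sketch below is illustrative only: the shapes, the random seed, and the use of NumPy are assumptions not drawn from the text, and np.allclose is used because floating-point arithmetic satisfies these identities only up to rounding.

```python
import numpy as np

rng = np.random.default_rng(0)

# Conformable shapes chosen arbitrarily for illustration.
A = rng.random((3, 4))
B = rng.random((4, 5))
C = rng.random((4, 5))
D = rng.random((3, 4))
E = rng.random((3, 4))
F = rng.random((4, 5))
G = rng.random((5, 2))

# Left-hand distributive law: A(B + C) = AB + AC
assert np.allclose(A @ (B + C), A @ B + A @ C)

# Right-hand distributive law: (D + E)F = DF + EF
assert np.allclose((D + E) @ F, D @ F + E @ F)

# Associative law: A(BC) = (AB)C
assert np.allclose(A @ (B @ G), (A @ B) @ G)
```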
Proof. To prove the left-hand distributive property, demonstrate that the corresponding entries in the matrices A(B + C) and AB + AC are equal. To this end, use the definition of matrix multiplication to write
$$
\begin{aligned}
[A(B + C)]_{ij} &= A_{i*}(B + C)_{*j} = \sum_k [A]_{ik}[B + C]_{kj} = \sum_k [A]_{ik}\bigl([B]_{kj} + [C]_{kj}\bigr) \\
&= \sum_k \bigl([A]_{ik}[B]_{kj} + [A]_{ik}[C]_{kj}\bigr) = \sum_k [A]_{ik}[B]_{kj} + \sum_k [A]_{ik}[C]_{kj} \\
&= A_{i*}B_{*j} + A_{i*}C_{*j} = [AB]_{ij} + [AC]_{ij} \\
&= [AB + AC]_{ij}.
\end{aligned}
$$
Since this is true for each i and j, it follows that A(B + C) = AB + AC.
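The entrywise argument can also be traced numerically. The following sketch is an illustration rather than part of the text: the shapes and the chosen entry (i, j) are arbitrary assumptions, and it evaluates the defining sum over k directly and compares it with the corresponding entry of AB + AC.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.random((3, 4))
B = rng.random((4, 5))
C = rng.random((4, 5))

i, j = 1, 2  # an arbitrary entry

# [A(B + C)]_ij expanded as the sum over k from the definition
lhs = sum(A[i, k] * (B[k, j] + C[k, j]) for k in range(A.shape[1]))

# [AB + AC]_ij
rhs = (A @ B)[i, j] + (A @ C)[i, j]

assert np.isclose(lhs, rhs)
```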
The proof of the right-hand distributive property is similar and is omitted. To prove the associative law, suppose that B is p × q and C is q × n, and recall from (3.5.7) that the jth column of BC is a linear combination of the columns in B. That is,
$$
[BC]_{*j} = B_{*1} c_{1j} + B_{*2} c_{2j} + \cdots + B_{*q} c_{qj} = \sum_{k=1}^{q} B_{*k} c_{kj}.
$$
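In other words, column j of BC is obtained by weighting the columns of B with the entries of the jth column of C. A minimal NumPy sketch of this column identity, with illustrative shapes chosen only for the example:

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.random((3, 4))  # p x q with p = 3, q = 4
C = rng.random((4, 5))  # q x n with n = 5

j = 2  # an arbitrary column index

# Column j of BC as the linear combination sum_k B_{*k} c_{kj}
combo = sum(B[:, k] * C[k, j] for k in range(B.shape[1]))

assert np.allclose((B @ C)[:, j], combo)
```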