Page 389 - Advanced Linear Algebra
Tensor Products 373
necessary) that

$$v_n = \sum_{i=1}^{n-1} a_i v_i$$

Then

$$\begin{aligned}
\sum_{i=1}^{n} u_i \otimes v_i
&= \sum_{i=1}^{n-1} u_i \otimes v_i + u_n \otimes \Bigl(\sum_{i=1}^{n-1} a_i v_i\Bigr) \\
&= \sum_{i=1}^{n-1} u_i \otimes v_i + \sum_{i=1}^{n-1} a_i (u_n \otimes v_i) \\
&= \sum_{i=1}^{n-1} (u_i + a_i u_n) \otimes v_i
\end{aligned}$$

But the vectors $\{u_i + a_i u_n\}$ are linearly independent. This
reduction can be repeated until the second coordinates are linearly independent.
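The reduction step can be checked numerically. The following sketch uses hypothetical vectors (not from the text) and identifies a tensor $u \otimes v$ in $U \otimes V$ with the outer product $uv^T$, so that a sum of decomposable tensors becomes a sum of rank-one matrices; absorbing a dependent second coordinate into the first coordinates leaves the tensor unchanged:

```python
import numpy as np

# Hypothetical instance: n = 3 terms in R^3 (x) R^3, where the third
# second-coordinate is dependent: v_3 = 2 v_1 - 3 v_2.
u = [np.array([1., 0., 0.]), np.array([0., 1., 0.]), np.array([1., 1., 1.])]
v = [np.array([1., 2., 0.]), np.array([0., 1., 1.])]
a = [2., -3.]
v.append(a[0] * v[0] + a[1] * v[1])   # v_3 = 2 v_1 - 3 v_2

# Identify u (x) v with the outer product u v^T.
z = sum(np.outer(u[i], v[i]) for i in range(3))

# One reduction step: sum over i = 1, 2 of (u_i + a_i u_3) (x) v_i
z_reduced = sum(np.outer(u[i] + a[i] * u[2], v[i]) for i in range(2))

# The two representations describe the same tensor, with one fewer term.
assert np.allclose(z, z_reduced)
```

The same check works for any choice of vectors and coefficients, since the identity used is just bilinearity of the tensor product.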
Moreover, the identity matrix $I_n$ is a coordinate matrix for $z$ and so
$n = \operatorname{rk}(I_n) = \operatorname{rk}(z)$. As to uniqueness, one direction was proved earlier; see
(14.3) and the other direction is left to the reader.
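The coordinate-matrix statement can be illustrated with a small numeric sketch (hypothetical vectors, again identifying $x \otimes y$ with the outer product $xy^T$): when both families of coordinates are linearly independent, the rank of the coordinate matrix equals the number of terms.

```python
import numpy as np

# Hypothetical vectors: {x_1, x_2} and {y_1, y_2} are each linearly
# independent, so z = x_1 (x) y_1 + x_2 (x) y_2 has rank 2, the
# number of terms in the representation.
x = [np.array([1., 0., 2.]), np.array([0., 1., 1.])]
y = [np.array([1., 1., 0.]), np.array([2., 0., 1.])]
z = sum(np.outer(x[i], y[i]) for i in range(2))

assert np.linalg.matrix_rank(z) == 2
```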
The proof of Theorem 14.6 shows that if $z \ne 0$ and

$$z = \sum_{i \in I} u_i \otimes t_i$$

where $u_i \in U$ and $t_i \in V$, then if the multiset $\{u_i \mid i \in I\}$ is not linearly
independent, we can rewrite $z$ in the form

$$z = \sum_{i \in I'} u_i \otimes t_i'$$

where $\{u_i \mid i \in I'\}$ is linearly independent. Then we can do the same for the
second coordinates to arrive at the representation
$$z = \sum_{i=1}^{\operatorname{rk}(z)} x_i \otimes y_i$$

where the multisets $\{x_i\}$ and $\{y_i\}$ are linearly independent sets. Therefore,
$\operatorname{rk}(z) \le |I|$ and so the rank of $z$ is the smallest integer $k$ for which $z$ can be
written as a sum of $k$ decomposable tensors. This is often taken as the
definition of the rank of a tensor.
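This characterization can be seen numerically (hypothetical data; for a tensor in $U \otimes V$, identifying $u \otimes t$ with the outer product $ut^T$, so that tensor rank becomes matrix rank): a representation may use more decomposable terms than the rank requires.

```python
import numpy as np

# Hypothetical representation with |I| = 3 decomposable terms whose
# first coordinates live in R^2 and hence are linearly dependent;
# the tensor itself has rank 2 < 3.
u = [np.array([1., 0.]), np.array([0., 1.]), np.array([1., 1.])]
t = [np.array([1., 2., 3.]), np.array([0., 1., 0.]), np.array([1., 0., 1.])]
z = sum(np.outer(u[i], t[i]) for i in range(3))

# The smallest number of decomposable (rank-one) terms summing to z is 2.
assert np.linalg.matrix_rank(z) == 2
```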
However, we caution the reader that there is another meaning to the word rank
when applied to a tensor, namely, it is the number of indices required to write
the tensor. Thus, a scalar has rank 0, a vector has rank 1, the tensor $z$ above has
rank 2 and a tensor of the form