Unit 16: Direct Sum Decompositions of Elementary Canonical Forms
16.2 Direct-sum Decompositions
Definition: Let W_1, ..., W_k be subspaces of the vector space V. We say that W_1, ..., W_k are independent if
   α_1 + ... + α_k = 0,   α_i ∈ W_i
implies that each α_i is 0.
For k = 2, the meaning of independence is simply {0} intersection, i.e., W_1 and W_2 are independent if and only if W_1 ∩ W_2 = {0}. If k > 2, the independence of W_1, ..., W_k says much more than W_1 ∩ ... ∩ W_k = {0}. It says that each W_j intersects the sum of the other subspaces W_i only in the zero vector.
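For example (an illustration, not from the text): in R^2 the three lines W_1 = span{(1, 0)}, W_2 = span{(0, 1)} and W_3 = span{(1, 1)} intersect pairwise only in {0}, yet (1, 0) + (0, 1) + (–1, –1) = 0, so the three together are not independent. Since independence is equivalent to the dimension count dim(W_1 + ... + W_k) = dim W_1 + ... + dim W_k (a consequence of the lemma proved below), it can be tested numerically; the following Python/numpy sketch, with hypothetical helper names, does exactly that.
```python
import numpy as np

# Rows of each array span the corresponding subspace of R^2.
W1 = np.array([[1.0, 0.0]])   # W_1 = span{(1, 0)}
W2 = np.array([[0.0, 1.0]])   # W_2 = span{(0, 1)}
W3 = np.array([[1.0, 1.0]])   # W_3 = span{(1, 1)}

def dim_of_sum(*bases):
    # dim(W_1 + ... + W_k) = rank of all the spanning vectors stacked together
    return np.linalg.matrix_rank(np.vstack(bases))

def independent(*bases):
    # W_1, ..., W_k are independent iff dim(W_1 + ... + W_k) = dim W_1 + ... + dim W_k
    return dim_of_sum(*bases) == sum(np.linalg.matrix_rank(B) for B in bases)

print(independent(W1, W2), independent(W1, W3), independent(W2, W3))  # True True True
print(independent(W1, W2, W3))                                        # False
```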
The significance of independence is this. Let W = W_1 + ... + W_k be the subspace spanned by W_1, ..., W_k. Each vector α in W can be expressed as a sum
   α = α_1 + ... + α_k,   α_i ∈ W_i.
If W_1, ..., W_k are independent, then that expression for α is unique; for if
   α = β_1 + ... + β_k,   β_i ∈ W_i
then 0 = (α_1 – β_1) + ... + (α_k – β_k), hence α_i – β_i = 0, i = 1, ..., k. Thus, when W_1, ..., W_k are independent, we can operate with the vectors in W as k-tuples (α_1, ..., α_k), α_i ∈ W_i, in the same way as we operate with vectors in R^k as k-tuples of numbers.
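To make this k-tuple point of view concrete, here is a small Python/numpy sketch (an illustration, not from the text) that decomposes a vector of R^3 over two independent subspaces, a plane W_1 and a line W_2: the concatenated bases form a basis of W (part (c) of the lemma below), so the coordinates, and hence the components α_1 and α_2, are uniquely determined.
```python
import numpy as np

# Ordered bases of two independent subspaces of R^3 (hypothetical example).
B1 = np.array([[1.0, 0.0, 0.0],
               [0.0, 1.0, 0.0]])      # W_1 = the xy-plane
B2 = np.array([[1.0, 1.0, 1.0]])      # W_2 = a line not contained in W_1

alpha = np.array([2.0, 3.0, 5.0])     # a vector of W = W_1 + W_2 = R^3

# The rows of B form a basis of W, so alpha has unique coordinates with respect to B.
B = np.vstack([B1, B2])
coords = np.linalg.solve(B.T, alpha)  # solves B.T @ coords == alpha

alpha_1 = coords[:2] @ B1             # component in W_1
alpha_2 = coords[2:] @ B2             # component in W_2
print(alpha_1, alpha_2)               # [-3. -2.  0.] [5. 5. 5.]; alpha_1 + alpha_2 == alpha
```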
Lemma: Let V be a finite-dimensional vector space. Let W_1, ..., W_k be subspaces of V and let W = W_1 + ... + W_k. The following are equivalent.
(a) W_1, ..., W_k are independent.
(b) For each j, 2 ≤ j ≤ k, we have
   W_j ∩ (W_1 + ... + W_{j–1}) = {0}.
(c) If B_i is an ordered basis for W_i, 1 ≤ i ≤ k, then the sequence B = (B_1, ..., B_k) is an ordered basis for W.
Proof: Assume (a). Let α be a vector in the intersection W_j ∩ (W_1 + ... + W_{j–1}). Then there are vectors α_1, ..., α_{j–1} with α_i ∈ W_i such that α = α_1 + ... + α_{j–1}. Since
   α_1 + ... + α_{j–1} + (–α) + 0 + ... + 0 = 0
and since W_1, ..., W_k are independent, it must be that α_1 = α_2 = ... = α_{j–1} = α = 0.
Now, let us observe that (b) implies (a). Suppose
   0 = α_1 + ... + α_k,   α_i ∈ W_i
with not every α_i zero. Let j be the largest integer i such that α_i ≠ 0. Then
   0 = α_1 + ... + α_j,   α_j ≠ 0.
Thus α_j = –α_1 – ... – α_{j–1} is a non-zero vector in W_j ∩ (W_1 + ... + W_{j–1}), contradicting (b). Hence each α_i is 0, and (a) holds.
Now that we know (a) and (b) are the same, let us see why (a) is equivalent to (c). Assume (a). Let B_i be an ordered basis for W_i, 1 ≤ i ≤ k, and let B = (B_1, ..., B_k). Any linear relation between the vectors in B will have the form
   β_1 + ... + β_k = 0
where β_i is some linear combination of the vectors in B_i. Since W_1, ..., W_k are independent, each β_i is 0. Since each B_i is independent, the relation we have between the vectors in B is the trivial relation.
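Conditions (b) and (c) also suggest a quick numerical sanity check (an illustration, not part of the proof): by the dimension formula, (b) says dim(W_1 + ... + W_j) = dim(W_1 + ... + W_{j–1}) + dim W_j for each j, while (c) says dim(W_1 + ... + W_k) = dim W_1 + ... + dim W_k. A Python/numpy sketch with randomly chosen subspaces of R^5 and a hypothetical helper dim:
```python
import numpy as np

rng = np.random.default_rng(0)
bases = [rng.standard_normal((d, 5)) for d in (2, 2, 1)]   # bases of W_1, W_2, W_3 in R^5

def dim(*blocks):
    # dimension of the sum of the subspaces spanned by the given rows
    return np.linalg.matrix_rank(np.vstack(blocks))

# (b): W_j meets W_1 + ... + W_{j-1} only in {0}, i.e. the dimensions add up at each step.
cond_b = all(dim(*bases[:j]) == dim(*bases[:j - 1]) + dim(bases[j - 1])
             for j in range(2, len(bases) + 1))

# (c): the concatenated bases form a basis of W = W_1 + ... + W_k.
cond_c = dim(*bases) == sum(dim(B) for B in bases)

print(cond_b, cond_c)   # both True for generic random subspaces
```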