
Unit 9: Representation of Transformations by Matrices




We motivated the definition (4) of matrix multiplication via operations on the rows of a matrix.
          One sees here that a very strong motivation for the definition is to be found in composing linear
          transformations. Let us summarize formally.
Theorem 3: Let V, W, and Z be finite-dimensional vector spaces over the field F; let T be a linear
transformation from V into W and U a linear transformation from W into Z. If $\mathcal{B}$, $\mathcal{B}'$ and $\mathcal{B}''$ are
ordered bases for the spaces V, W and Z respectively, if A is the matrix of T relative to the pair
$\mathcal{B}, \mathcal{B}'$ and B is the matrix of U relative to the pair $\mathcal{B}', \mathcal{B}''$, then the matrix of the composition UT
relative to the pair $\mathcal{B}, \mathcal{B}''$ is the product matrix C = BA.
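
As a quick numerical illustration of Theorem 3 (an added sketch, not part of the original text; the matrices A and B are chosen arbitrarily), the following NumPy snippet represents T from R^2 into R^3 by A and U from R^3 into R^2 by B, both relative to the standard ordered bases, and checks that applying T and then U agrees with multiplying by the product C = BA:

    import numpy as np

    # Hypothetical matrices: A represents T : R^2 -> R^3 and B represents
    # U : R^3 -> R^2, both relative to the standard ordered bases.
    A = np.array([[1., 2.],
                  [0., 1.],
                  [3., -1.]])
    B = np.array([[2., 0., 1.],
                  [1., 1., 0.]])

    C = B @ A                    # matrix of the composition UT (Theorem 3)

    alpha = np.array([4., -2.])  # an arbitrary vector in V = R^2
    # U(T(alpha)) computed step by step agrees with C acting on alpha.
    assert np.allclose(B @ (A @ alpha), C @ alpha)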
We remark that Theorem 3 gives a proof that matrix multiplication is associative, a proof
which requires no calculations: composition of functions is associative, and by Theorem 3 the
product of matrices corresponds to the composition of the linear transformations they represent.
It is important to note that if T and U are linear operators on a space V and we represent them
by a single ordered basis $\mathcal{B}$, then Theorem 3 assumes the simple form $[UT]_{\mathcal{B}} = [U]_{\mathcal{B}}[T]_{\mathcal{B}}$. Thus in
this case, the correspondence which $\mathcal{B}$ determines between operators and matrices is not only a
vector space isomorphism but also preserves products. A simple consequence of this is that the
linear operator T is invertible if and only if $[T]_{\mathcal{B}}$ is an invertible matrix. For, the identity operator
I is represented by the identity matrix in any ordered basis, and thus

        UT = TU = I

is equivalent to

        $[U]_{\mathcal{B}}[T]_{\mathcal{B}} = [T]_{\mathcal{B}}[U]_{\mathcal{B}} = I$.
Of course, when T is invertible,

        $[T^{-1}]_{\mathcal{B}} = [T]_{\mathcal{B}}^{-1}$.
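To see this fact concretely (again an added sketch, with an arbitrarily chosen invertible operator), one can verify with NumPy that the matrix of the inverse operator is the inverse of the matrix:

    import numpy as np

    # An arbitrary invertible operator T on R^2, in the standard ordered basis.
    T_mat = np.array([[2., 1.],
                      [1., 1.]])

    # [T^{-1}] = [T]^{-1}: the inverse operator is represented by the inverse matrix.
    T_inv = np.linalg.inv(T_mat)

    # UT = TU = I corresponds to [U][T] = [T][U] = I.
    I = np.eye(2)
    assert np.allclose(T_inv @ T_mat, I) and np.allclose(T_mat @ T_inv, I)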
                          
          Now we should like to inquire what happens to representing matrices when the ordered basis
          is changed. For the sake of simplicity, we shall consider this question only for linear operators
          on a space V, so that we can use a single ordered basis. The specific question is this. Let T be a
          linear operator on the finite-dimensional space V, and let

                               
                  =  1 ,...,  n  and  '=  ' 1  ,...,  ' n

          be  two ordered basis for  V. How  are the  matrices T    and  T  '  '   related? There is a unique
          (invertible) n×n matrix P such that
                       P                                                           ...(5)
                          ' 

          for every vector  in V. It is the matrix  P  P  ,...,P where  Pj  '  . By definition
                                               1   n            j  
                  T     T     .                                                    ...(6)
                           
          Applying (5) to the vector T  ,we have
                  T     P T   .                                                    ...(7)
                            ' 
          Combining (5), (6) and (7), we obtain

                  T    P  '   P T  ' 

                  P  1  T  P   T
                           '    ' 


