

Finally, let

    V3 = X3 - (X3 · V1 / ||V1||²) V1 - (X3 · V2 / ||V2||²) V2
       = < 1,0,0,0,-5,0,0 > + < 1,2,0,0,2,0,0 > + (63/26) < -8/9,-7/9,0,0,11/9,0,0 >
       = < -2/13, 3/26, 0, 0, -1/26, 0, 0 >.

Then V1, V2, V3 form an orthogonal basis for S.
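For readers who want to check such a computation, here is a minimal sketch (not from the text) of the Gram-Schmidt process in Python with NumPy. The function name gram_schmidt is my own choice, and since the original vectors X1 and X2 of this example are not repeated on this page, the sketch simply verifies that V1, V2, V3 obtained above are mutually orthogonal.

    # A minimal sketch of the Gram-Schmidt process (not from the text).
    import numpy as np

    def gram_schmidt(vectors):
        """Return an orthogonal basis spanning the same subspace as `vectors`."""
        basis = []
        for x in vectors:
            v = np.array(x, dtype=float)
            for b in basis:
                v -= (np.dot(v, b) / np.dot(b, b)) * b   # subtract projection onto b
            if not np.allclose(v, 0):                     # skip linearly dependent inputs
                basis.append(v)
        return basis

    # The orthogonal basis obtained in the example above.
    V1 = np.array([1, 2, 0, 0, 2, 0, 0], dtype=float)
    V2 = np.array([-8/9, -7/9, 0, 0, 11/9, 0, 0])
    V3 = np.array([-2/13, 3/26, 0, 0, -1/26, 0, 0])

    for a, b in [(V1, V2), (V1, V3), (V2, V3)]:
        assert abs(np.dot(a, b)) < 1e-12   # pairwise dot products are zero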



                               SECTION 6.5        PROBLEMS



In each of Problems 1 through 8, use the Gram-Schmidt process to find an orthogonal basis spanning the same subspace of R^n as the given set of vectors.

1. < 1,4,0 >, < 2,-5,0 > in R^3
2. < 0,-1,2,0 >, < 0,3,-4,0 > in R^4
3. < 0,2,1,-1 >, < 0,-1,1,6 >, < 0,2,2,3 > in R^4
4. < -1,0,3,0,4 >, < 4,0,-1,0,3 >, < 0,0,-1,0,5 > in R^5
5. < 0,0,2,2,1 >, < 0,0,1,-1,5 >, < 0,1,-2,1,0 >, < 0,1,1,2,0 > in R^5
6. < 1,2,0,-1,2,0 >, < 3,1,-3,-4,0,0 >, < 0,-1,0,-5,0,0 >, < 1,-6,4,-2,-3,0 > in R^6
7. < 0,0,1,1,0,0 >, < 0,0,-3,0,0,0 > in R^6
8. < 0,-2,0,-2,0,-2 >, < 0,1,0,-1,0,0 >, < 0,-4,0,0,0,6 > in R^6



                            6.6         Orthogonal Complements and Projections

                                        The Gram-Schmidt process serves as a springboard to an important concept that has practical
                                        consequences, including the rationale for least squares approximations (see Section 7.8).


Let S be a subspace of R^n. Denote by S⊥ the set of all vectors in R^n that are orthogonal to every vector in S. S⊥ is called the orthogonal complement of S in R^n.
For example, in R^3, suppose S is the two-dimensional subspace having < 1,0,0 > and < 0,1,0 > as basis. We think of S as the x,y-plane. Now S⊥ consists of all vectors in 3-space that are perpendicular to this plane, hence all constant multiples of k = < 0,0,1 >.
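As a numerical illustration (not part of the text), an orthogonal complement can be computed as the null space of the matrix whose rows form a basis for S. The short NumPy sketch below uses the singular value decomposition; the function name orthogonal_complement and the tolerance are my own choices. Applied to the x,y-plane example, it recovers the line of multiples of k.

    # A minimal sketch (not from the text): compute S-perp as the null space
    # of the matrix whose rows are a basis for S, via the SVD.
    import numpy as np

    def orthogonal_complement(rows, tol=1e-12):
        """Return an orthonormal basis (as rows) for the orthogonal complement
        of the row space of `rows`."""
        A = np.atleast_2d(np.array(rows, dtype=float))
        _, s, Vt = np.linalg.svd(A)
        rank = int(np.sum(s > tol))
        return Vt[rank:]              # right singular vectors orthogonal to every row of A

    # S = span{< 1,0,0 >, < 0,1,0 >}, the x,y-plane in R^3.
    basis_S = [[1, 0, 0], [0, 1, 0]]
    print(orthogonal_complement(basis_S))   # one row, a multiple of k = < 0,0,1 > (up to sign)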
In this example, S⊥ is a subspace of R^3. We claim that this is always true.
                                  THEOREM 6.6

If S is a subspace of R^n, then S⊥ is also a subspace of R^n. Further, the only vector in both S and S⊥ is the zero vector.
Proof  The zero vector is certainly in S⊥ because O is orthogonal to every vector, hence to every vector in S.

