Let $V$ be an inner product space. Suppose $S=\{v_1,v_2,\dots,v_n\}$ is an orthogonal set of nonzero vectors...
























Let $V$ be an inner product space. Suppose $S=\{v_1,v_2,\dots,v_n\}$ is an orthogonal set of nonzero vectors in $V$ such that $V = \text{Span}(S)$. Prove that $S$ is a basis for $V$.

linear-algebra



closed as off-topic by Adrian Keister, Chris Custer, Brahadeesh, KReiser, Cesareo Dec 13 '18 at 8:53


This question appears to be off-topic. The users who voted to close gave this specific reason:


  • "This question is missing context or other details: Please provide additional context, which ideally explains why the question is relevant to you and our community. Some forms of context include: background and motivation, relevant definitions, source, possible strategies, your current progress, why the question is interesting or important, etc." – Adrian Keister, Chris Custer, Brahadeesh, KReiser, Cesareo

If this question can be reworded to fit the rules in the help center, please edit the question.













  • I edited your question to clean up the $\LaTeX$ a little bit. Cheers! – Robert Lewis, Dec 2 '18 at 19:29

  • It really boils down to showing that orthogonality implies linear independence; see Anurag A's answer. – Dave, Dec 2 '18 at 20:09

  • Thanks Dave got it right :) – Anas, Dec 3 '18 at 4:24

  • @RobertLewis Haha thanks man! – Anas, Dec 3 '18 at 4:25

  • My pleasure my friend! 😁😁😁 – Robert Lewis, Dec 3 '18 at 4:31
















edited Dec 2 '18 at 19:29 by Robert Lewis

asked Dec 2 '18 at 19:24 by Anas

3 Answers


















Consider
$$c_1v_1+c_2v_2+ \dotsb +c_nv_n =\mathbf{0}.$$
Take the inner product of both sides with the vector $v_k$ to get
$$c_1 \langle v_1, v_k \rangle +c_2\langle v_2, v_k \rangle+ \dotsb +c_k\langle v_k, v_k \rangle + \dotsb +c_n\langle v_n, v_k \rangle=0.$$
By orthogonality, every term on the left side vanishes except one, so we get
$$c_k \langle v_k, v_k \rangle =c_k\|v_k\|^2=0.$$
But $v_k$ is a nonzero vector, so $c_k=0$. The same argument works for every $k$, so all the coefficients are $0$. Thus the set is linearly independent, and since it already spans $V$, it is a basis.

– Anurag A, answered Dec 2 '18 at 19:44
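The coefficient-extraction identity $c_k = \langle x, v_k \rangle / \|v_k\|^2$ used above can be checked numerically. The following is a minimal sketch assuming the standard dot product on $\Bbb R^3$; the vectors are illustrative, not from the original post.

```python
# Verify that pairing with each v_k recovers the coefficients of a
# linear combination over an orthogonal set (hypothetical example set).

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

# An orthogonal set of nonzero vectors: pairwise inner products are 0.
S = [(1.0, 1.0, 0.0), (1.0, -1.0, 0.0), (0.0, 0.0, 2.0)]
assert all(dot(S[i], S[j]) == 0 for i in range(3) for j in range(3) if i != j)

# Build x = 3*v1 - 2*v2 + 0.5*v3, then recover each coefficient as
# c_k = <x, v_k> / <v_k, v_k>, the identity derived in the answer.
coeffs = (3.0, -2.0, 0.5)
x = tuple(sum(c * v[i] for c, v in zip(coeffs, S)) for i in range(3))
recovered = [dot(x, v) / dot(v, v) for v in S]
print(recovered)  # [3.0, -2.0, 0.5]
```

In particular, if $x = \mathbf{0}$ the recovered coefficients are all zero, which is exactly the linear-independence conclusion.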






I am going to assume that $V$ is an inner product space over the real field $\Bbb R$, where the inner product is a bilinear mapping

$\langle \cdot, \cdot \rangle: V \times V \to \Bbb R \tag 1$

such that, for all $v \in V$,

$\Vert v \Vert^2 = \langle v, v \rangle \ge 0, \tag 2$

where equality holds if and only if

$v = 0; \tag 3$

then if a linear dependence existed between the elements of $S$, we would have

$\alpha_i \in \Bbb R, \; 1 \le i \le n, \tag 4$

such that

$\exists i, \; 1 \le i \le n, \; \alpha_i \ne 0, \tag 5$

that is, not all the $\alpha_i$ vanish, such that

$\displaystyle \sum_1^n \alpha_i v_i = 0. \tag 6$

If we take the inner product of each side of this equation with any $v_j$, we find that

$\alpha_j \langle v_j, v_j \rangle = \displaystyle \sum_1^n \alpha_i \langle v_j, v_i \rangle = \left\langle v_j, \displaystyle \sum_1^n \alpha_i v_i \right\rangle = \langle v_j, 0 \rangle = 0, \tag 7$

since $i \ne j$ implies

$\langle v_j, v_i \rangle = 0 \tag 8$

by the orthogonality of the members of $S$. Since

$v_j \ne 0, \; 1 \le j \le n, \tag 9$

we have

$\langle v_j, v_j \rangle \ne 0, \tag{10}$

whence (7) yields

$\alpha_j = 0, \; 1 \le j \le n; \tag{11}$

it follows that the set $S$ is linearly independent over $\Bbb R$, and hence, since

$V = \text{Span}(S), \tag{12}$

that $S$ is a basis for $V$.

– Robert Lewis, answered Dec 2 '18 at 20:11
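The same argument can be phrased through the Gram matrix $G_{jk} = \langle v_j, v_k \rangle$: orthogonality makes $G$ diagonal, and nonzero vectors make its diagonal entries positive, so the only solution of $\sum_j \alpha_j v_j = 0$ is $\alpha = 0$. A small numerical sketch with the standard dot product on $\Bbb R^3$ (the vectors are illustrative, not from the post):

```python
# Sketch: for an orthogonal set of nonzero vectors, the Gram matrix
# G[j][k] = <v_j, v_k> is diagonal with positive diagonal, so the
# system a_j * <v_j, v_j> = 0 forces every a_j = 0.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

# Hypothetical orthogonal nonzero set; any such set works.
S = [(1.0, 1.0, 0.0), (1.0, -1.0, 0.0), (0.0, 0.0, 2.0)]
G = [[dot(u, v) for v in S] for u in S]

# Off-diagonal entries vanish by orthogonality ...
assert all(G[j][k] == 0 for j in range(3) for k in range(3) if j != k)
# ... and diagonal entries are ||v_j||^2 > 0 since no v_j is zero.
assert all(G[j][j] > 0 for j in range(3))
```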






Since the vectors $v_1,\ldots,v_n$ are orthogonal to each other, they are all linearly independent. So we have $n$ linearly independent vectors that span $V$, thus $S$ is a basis for $V$.

Edit:
Why are the vectors linearly independent?
First, check out the definition here.

For any linear combination $v = \sum_{j=1}^n a_j v_j$ of the nonzero vectors $v_j$, we know that if $v = 0$ holds, then all $a_j$ must be zero to fulfill the equality (since all $v_j \neq 0$), thus fulfilling the definition of linear independence.

– Thomas Lang, answered Dec 2 '18 at 19:27, edited Dec 2 '18 at 19:44

























