Sum of outer products of orthonormal vectors [closed]














If $v_1, v_2, \ldots, v_n$ are orthonormal vectors in $\mathbb{R}^n$, is there anything special about $v_1v_1^T + v_2v_2^T + \cdots + v_nv_n^T$?













closed as off-topic by Saad, Brahadeesh, Shaun, stressed out, user10354138 Dec 20 '18 at 17:10


This question appears to be off-topic. The users who voted to close gave this specific reason:


  • "This question is missing context or other details: Please provide additional context, which ideally explains why the question is relevant to you and our community. Some forms of context include: background and motivation, relevant definitions, source, possible strategies, your current progress, why the question is interesting or important, etc." – Saad, Brahadeesh, Shaun, stressed out, user10354138

If this question can be reworded to fit the rules in the help center, please edit the question.
















  • Row vectors or column vectors? – Kavi Rama Murthy, Dec 20 '18 at 6:06

  • @KaviRamaMurthy Column vectors. I guess I have found the answer: it looks like the identity matrix. – Studying Optimization, Dec 20 '18 at 6:06

  • Orthogonal or orthonormal? Let $V$ be the matrix; then $V v_k = \|v_k\|^2 v_k$. – copper.hat, Dec 20 '18 at 6:07

  • @copper.hat They have unit length and are orthogonal to each other, so yes, they are orthonormal. – Studying Optimization, Dec 20 '18 at 6:08

  • Then it is the identity, since it agrees with the identity on a basis. – copper.hat, Dec 20 '18 at 6:08

























linear-algebra














asked Dec 20 '18 at 6:04 by Studying Optimization
edited Dec 20 '18 at 6:08




























3 Answers


















If $V = [v_1 \; \ldots \; v_n]$, then $V^T V = I$, since the $(i,j)$ entry of $V^T V$ is $v_i^T v_j$. Because $V$ is square, this means $V^T = V^{-1}$, and since $VV^{-1} = I$, we also get $VV^T = I$. Expanding $VV^T$ as a sum of outer products gives $v_1v_1^T + \cdots + v_nv_n^T = I$, the desired result.

– Ahmad Bazzi, Dec 20 '18 at 7:21
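As a quick numerical sanity check (a sketch, assuming NumPy; the QR factorization is just one convenient way to produce a random orthonormal basis of $\mathbb{R}^n$):

    import numpy as np

    n = 5
    # The Q factor of a random square matrix has orthonormal columns,
    # so they form an orthonormal basis of R^n.
    Q, _ = np.linalg.qr(np.random.randn(n, n))

    # Sum the outer products v_i v_i^T over all basis vectors.
    S = sum(np.outer(Q[:, i], Q[:, i]) for i in range(n))

    # S equals the n x n identity, up to floating-point error.
    print(np.allclose(S, np.eye(n)))  # True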





















We have $v_j v_j^T = \|v_j\|^2 = 1$ for all $j = 1, 2, \ldots, n$. Hence $v_1v_1^T + v_2v_2^T + \cdots + v_nv_n^T = n$.

– Fred, Dec 20 '18 at 6:19













  • I believe the vectors are column vectors, so the terms are matrices, not scalars. – John Douma, Dec 20 '18 at 6:25

  • Yes, it is an identity matrix. – Studying Optimization, Dec 20 '18 at 6:56
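To illustrate the distinction the comments are drawing (a sketch, assuming NumPy): for a column vector $v$, $v^T v$ is the scalar $\|v\|^2$, while $v v^T$ is a rank-one matrix.

    import numpy as np

    v = np.array([1.0, 0.0, 0.0])   # a unit vector in R^3
    print(np.inner(v, v))           # v^T v: the scalar 1.0
    print(np.outer(v, v))           # v v^T: a 3x3 rank-one matrix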



















If the $v_i$ are column vectors, multiply $v_1v_1^T + \dots + v_nv_n^T$ by an arbitrary vector $x$:
$$
(v_1v_1^T + \dots + v_nv_n^T)x = v_1(v_1^Tx) + \dots + v_n(v_n^Tx).
$$

Since the $v_i$ form an orthonormal basis, $v_i^Tx$ is the $i$-th coefficient in the decomposition of $x$: writing $x = a_1v_1 + \dots + a_nv_n$ gives $v_i^Tx = v_i^T(a_1v_1 + \dots + a_nv_n) = a_iv_i^Tv_i = a_i$. Therefore
$$
(v_1v_1^T + \dots + v_nv_n^T)x = a_1v_1 + \dots + a_nv_n = x,
$$

which implies that $v_1v_1^T + \dots + v_nv_n^T$ is the identity matrix.

– Sergei Golovan, Dec 20 '18 at 7:13
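The same reconstruction argument can be checked numerically (a sketch, assuming NumPy, with the orthonormal basis again taken from a QR factorization):

    import numpy as np

    n = 4
    Q, _ = np.linalg.qr(np.random.randn(n, n))  # columns form an orthonormal basis
    x = np.random.randn(n)

    # Each term (v_i v_i^T) x = (v_i^T x) v_i is the projection of x
    # onto span(v_i); summing the projections reconstructs x.
    recon = sum((Q[:, i] @ x) * Q[:, i] for i in range(n))
    print(np.allclose(recon, x))  # True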



















