Quickly calculating an orthogonal matrix & Spectral Theorem























I was trying to find a quicker way to identify whether a set of vectors was an orthogonal set or not. I discovered something interesting, but I don't understand how to put this intuition into words.




Let $S$ be a set of 3 vectors, which I combined into a $3\times 3$ matrix:




$$
S
=
\begin{bmatrix} 3\\1\\1 \end{bmatrix},
\begin{bmatrix} -1\\2\\1 \end{bmatrix},
\begin{bmatrix} -.5\\-2\\3.5 \end{bmatrix}
=
\begin{bmatrix} 3&-1&-.5\\1&2&-2\\1&1&3.5 \end{bmatrix}
$$




$S^TS = D$, a diagonal matrix:




$D =
\begin{bmatrix} 11&0&0\\0&6&0\\0&0&16.5 \end{bmatrix}$



I noticed that this pattern holds for all square matrices with orthogonal columns, and it may hold for non-square matrices as well. It seems to have something to do with eigenvalues and the spectral theorem, but I only learned how to calculate eigenvalues and eigenvectors yesterday, and we aren't going to touch this 'spectral theorem' thing until the end of the semester.



My questions are:



1) If you could sum up briefly, how would you describe what is happening?



2) Does my conjecture hold for non-square matrices as well? My proof-writing skills are not strong enough to work through this on my own.
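For what it's worth, the observation is easy to check numerically. Here is a quick sketch in Python with NumPy (using the matrix $S$ from above; the expected diagonal is the one computed in the post):

```python
import numpy as np

# Columns of S are the three (pairwise orthogonal) vectors from the post.
S = np.array([[3.0, -1.0, -0.5],
              [1.0,  2.0, -2.0],
              [1.0,  1.0,  3.5]])

# Gram matrix: every entry is a pairwise dot product of two columns of S.
D = S.T @ S

print(D)
# Off-diagonal entries are zero exactly when the columns are orthogonal,
# and the diagonal holds the squared column norms 11, 6, 16.5.
print(np.allclose(D, np.diag([11.0, 6.0, 16.5])))  # True
```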
































  • Is your claim that, if you construct a matrix $S$ out of orthogonal vectors as above, you'll find that $S^T S$ is a diagonal matrix? (If so, you might want to state that explicitly.)
    – aghostinthefigures
    Nov 21 at 20:35










  • Be careful with your terminology: in usual usage, an “orthogonal” matrix $M$ is one for which $M^TM=MM^T=I$, which is a stronger condition than simply having orthogonal columns.
    – amd
    Nov 21 at 20:57












  • Think about what the elements of $S^TS$ are in terms of the columns of $S$.
    – amd
    Nov 21 at 20:58










  • @amd That would be "orthonormal", wouldn't it?
    – user58697
    Nov 21 at 21:34










  • @user58697 No. It’s an unfortunate bit of terminology. The rows/columns of an orthogonal matrix form an orthonormal set of vectors.
    – amd
    Nov 21 at 21:35

















linear-algebra






asked Nov 21 at 20:02









Evan Kim

1 Answer

Let’s go back to the definition of matrix multiplication. The elements of the product $C=AB$ are given by the formula $c_{ij}=\sum_k a_{ik}b_{kj}$. That is, the $ij$-th element of the product of two matrices is the dot product of the $i$th row of the first matrix with the $j$th column of the second.



Now, the rows of $S^T$ are the columns of $S$, so $S^TS$ consists of all of the pairwise dot products of the columns of $S$. Thus, its diagonal elements are the squares of the norms of those columns, and the off-diagonal elements are zero iff the corresponding pair of columns is orthogonal. None of this requires that $S$ be square, but the product $S^TS$ will always be a square matrix.
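To illustrate the last point, here is a small numerical sketch (Python/NumPy, with example vectors of my own choosing, not from the question): a non-square $S$ with orthogonal columns still yields a diagonal, square $S^TS$.

```python
import numpy as np

# A 4x2 matrix whose two columns are orthogonal: (1,1,1,1)·(1,-1,1,-1) = 0.
S = np.array([[1.0,  1.0],
              [1.0, -1.0],
              [1.0,  1.0],
              [1.0, -1.0]])

G = S.T @ S  # 2x2 Gram matrix: square even though S is 4x2

print(G)
# Diagonal entries are the squared column norms (4 and 4);
# the off-diagonal entries vanish because the columns are orthogonal.
assert G.shape == (2, 2)
assert np.allclose(G, np.diag([4.0, 4.0]))
```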



























        answered Nov 21 at 21:48









        amd






























