Singular vectors of a symmetric block secondary diagonal matrix











Given $A \in \mathbb{R}^{n \times m}$, consider the symmetric matrix



$M = \begin{pmatrix} 0 & A \\ A^{t} & 0 \end{pmatrix} \in \mathbb{R}^{(n+m) \times (n+m)}$.



Show that a simple relationship exists between the singular vectors of $A$ and the eigenvectors of $M$. Show how to build an orthogonal basis of $\mathbb{R}^{n+m}$ consisting of eigenvectors of $M$, given the singular vectors of $A$.



My attempt: Consider the singular value decomposition $A = U\Sigma V^{t}$, where $U_{n \times n}$ and $V_{m \times m}$ are orthogonal and $\Sigma$ is a rectangular diagonal matrix of singular values.



$$\det(M - \lambda I) = \det\big((-\lambda I)(-\lambda I) - AA^{t}\big) = \det(\lambda^2 I - AA^{t}) = \det(\lambda^2 I - U\Sigma V^{t}V\Sigma^{t}U^{t}) = \det(\lambda^2 I - U\Sigma\Sigma^{t}U^{t}) = \det\begin{pmatrix}\lambda^2 - \sigma_1^2 & 0 & \dots & 0 \\ 0 & \lambda^2 - \sigma_2^2 & \dots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \dots & \lambda^2 - \sigma_n^2\end{pmatrix} = \prod_{k=1}^{n} (\lambda^2 - \sigma_k^2) = 0 \implies \lambda = \pm\sigma_k$$



for some $k \in \{1, \dots, n\}$.



Let $v_k$ be an eigenvector associated with $\lambda$.



$Mv_k = \lambda v_k = \pm\sigma_k v_k$, and $\sigma_k v_k$ equals either $Av_k$, $A^{t}v_k$, or $0$ (by a theorem in the book), depending on the index $k$.
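As a quick numerical sanity check of the claimed spectrum, here is a minimal sketch (assuming NumPy; the matrix and its sizes are arbitrary illustrations, not part of the problem). The eigenvalues of $M$ should come out as $\pm\sigma_k$ for $k = 1, \dots, \min(n,m)$, plus $|n-m|$ zeros when $A$ is rectangular:

```python
import numpy as np

# Illustrative sizes only; any real matrix A shows the same pattern.
rng = np.random.default_rng(0)
n, m = 4, 2
A = rng.standard_normal((n, m))

# M = [[0, A], [A^t, 0]] is symmetric of size (n+m) x (n+m).
M = np.block([[np.zeros((n, n)), A], [A.T, np.zeros((m, m))]])

sigma = np.linalg.svd(A, compute_uv=False)   # singular values of A
eigs = np.sort(np.linalg.eigvalsh(M))        # eigenvalues of the symmetric M

# Expected spectrum: +sigma_k, -sigma_k, plus |n - m| zero eigenvalues.
expected = np.sort(np.concatenate([sigma, -sigma, np.zeros(abs(n - m))]))
print(np.allclose(eigs, expected))           # True
```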



I'm unsure, especially about the last part; it doesn't feel right. Also, I'd like some guidance on how to build this basis of orthogonal eigenvectors. I know Gram-Schmidt, but not when singular vectors are involved...



Please verify what I did and show me how to improve it, or maybe start all over...



Thanks.










matrices eigenvalues-eigenvectors matrix-decomposition singularvalues






asked Nov 12 at 19:23 by dude3221, edited Nov 13 at 14:09

  • If $M$ is said to be symmetric, do I then also have that $A$ is symmetric? Or something like that? ($A$ is not a square matrix...)
    – dude3221
    Nov 12 at 19:26




















1 Answer
I would start the same way you did: let $A = U\Sigma V^{t}$ be the SVD of $A$. Then the SVD of $A^{t}$ is given by $A^{t} = V\Sigma^{t}U^{t}$. Inserting into $M$:



$$M = \begin{bmatrix} 0_{n \times n} & U \Sigma V^{t} \\ V \Sigma^{t} U^{t} & 0_{m \times m} \end{bmatrix}.$$



Now, let us try the vectors $q_k = [u_k^{t}, \pm v_k^{t}]^{t}$, where $u_k$ and $v_k$ denote the $k$-th columns of $U$ and $V$. Inserting into $M$ yields



$$M q_k = \begin{bmatrix} \pm\, U \Sigma V^{t} v_k \\ V \Sigma^{t} U^{t} u_k \end{bmatrix} = \begin{bmatrix} \pm\sigma_k u_k \\ \sigma_k v_k \end{bmatrix} = \pm\sigma_k q_k.$$



This shows that the $q_k$ are eigenvectors of $M$ with eigenvalues $\pm\sigma_k$. Note that for $n \neq m$ you will also get some zero eigenvalues. This is due to the fact that either $A$ or $A^{t}$ then has a nontrivial kernel, so we can find nonzero vectors $q$ with $Mq = 0$.
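To make the construction of the orthogonal eigenvector basis explicit, here is a minimal numerical sketch (assuming NumPy; the sizes are arbitrary). It normalizes each $q_k$ by $1/\sqrt{2}$, appends the leftover singular vectors padded with zeros for the zero eigenvalues, and checks that the resulting matrix is orthogonal and diagonalizes $M$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 3                                   # arbitrary illustrative sizes
A = rng.standard_normal((n, m))
M = np.block([[np.zeros((n, n)), A], [A.T, np.zeros((m, m))]])

# Full SVD: U is n x n, V is m x m, s holds the min(n, m) singular values.
U, s, Vt = np.linalg.svd(A, full_matrices=True)
V = Vt.T

cols, vals = [], []
for k in range(min(n, m)):
    u, v = U[:, k], V[:, k]
    cols.append(np.concatenate([u,  v]) / np.sqrt(2)); vals.append( s[k])
    cols.append(np.concatenate([u, -v]) / np.sqrt(2)); vals.append(-s[k])
# Leftover singular vectors span the kernel of A^t (or of A) -> eigenvalue 0.
for j in range(min(n, m), n):
    cols.append(np.concatenate([U[:, j], np.zeros(m)])); vals.append(0.0)
for j in range(min(n, m), m):
    cols.append(np.concatenate([np.zeros(n), V[:, j]])); vals.append(0.0)

Q = np.column_stack(cols)                     # (n+m) x (n+m)
lam = np.array(vals)

print(np.allclose(Q.T @ Q, np.eye(n + m)))    # True: orthonormal basis
print(np.allclose(M @ Q, Q @ np.diag(lam)))   # True: columns are eigenvectors
```

Any $A$ gives the same pattern: $2\min(n,m)$ eigenvectors with eigenvalues $\pm\sigma_k$ and $|n-m|$ eigenvectors with eigenvalue $0$.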






answered Nov 13 at 9:13 by Florian, edited Nov 13 at 15:21
  • I think I get what you did in this decomposition, but what can I conclude from it about the relation between the eigenvectors of $M$ and the singular vectors of $A$?
    – dude3221
    Nov 13 at 14:05










  • Oh. My bad! I somehow misread your question and assumed that you are looking for a connection to the singular vectors of $M$. My apologies.
    – Florian
    Nov 13 at 15:09












  • I edited my answer to correct my error.
    – Florian
    Nov 13 at 15:21










  • It's not your fault, I indeed typed incorrectly before and edited. Thanks.
    – dude3221
    Nov 13 at 16:07










