Decomposing a Vector Space into a direct sum of Generalized Eigenspaces























I've seen a proof that doesn't involve any induction, but it has been bothering me, so now I am trying to prove this inductively.



Suppose $T$ is a linear operator on a finite-dimensional vector space $V$ whose characteristic polynomial $\phi(t)$ factors as
\begin{gather}
\phi(t) = \prod^n_{i = 1}(t - \lambda_i)^{m_i},
\end{gather}

where the $\lambda_i$ are the distinct eigenvalues of $T$. Then, writing $K_{\lambda_i}$ for the generalized eigenspace of $\lambda_i$, we have
\begin{gather}
V = \bigoplus^n_{i = 1}K_{\lambda_i}.
\end{gather}

My idea was to use a base case of two eigenvalues: suppose $\phi(t) = (t - \lambda_1)^{m_1}(t - \lambda_2)^{m_2}$, and set $f(t) = (t - \lambda_1)^{m_1}$ and $g(t) = (t - \lambda_2)^{m_2}$. Since $f$ and $g$ are coprime, it follows that
\begin{gather}
V = N(f(T)) \oplus R(f(T)).
\end{gather}

Additionally, we note that $N(f(T)) = K_{\lambda_1}$ and $R(f(T)) = N(g(T)) = K_{\lambda_2}$, so
\begin{gather}
V = K_{\lambda_1} \oplus K_{\lambda_2}.
\end{gather}
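As a numerical sanity check of this base case, here is a small sketch using a hypothetical $3 \times 3$ matrix with eigenvalues $2$ (multiplicity $2$) and $3$, so $f(t) = (t-2)^2$; the matrix and tolerances are illustrative choices, not part of the original argument:

```python
# Sanity check of V = N(f(T)) ⊕ R(f(T)) for a hypothetical example:
# eigenvalues 2 (algebraic multiplicity 2) and 3, so f(t) = (t - 2)^2.
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
I = np.eye(3)
fA = (A - 2 * I) @ (A - 2 * I)        # f(T) = (T - 2I)^2

# An SVD gives orthonormal bases for the range (leading columns of U)
# and the null space (trailing rows of Vt) of f(T).
U, s, Vt = np.linalg.svd(fA)
rank = int(np.sum(s > 1e-10))
range_basis = U[:, :rank]             # basis of R(f(T)), expected to be K_3
null_basis = Vt[rank:].T              # basis of N(f(T)), expected to be K_2

# The two subspaces have complementary dimensions and together span V.
combined = np.hstack([null_basis, range_basis])
print(null_basis.shape[1], range_basis.shape[1])    # 2 1
print(np.linalg.matrix_rank(combined))              # 3, so V = K_2 ⊕ K_3
print(np.allclose((A - 3 * I) @ range_basis, 0))    # True: R(f(T)) = N(g(T))
```

Of course this only checks one example, but it mirrors each equality in the base case.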

So, this is where I am not sure how to continue: I don't know what to take as my inductive hypothesis. My ultimate goal is to handle the general factorization
\begin{gather}
\phi(t) = (t - \lambda_1)^{m_1} \cdots (t - \lambda_n)^{m_n}
\end{gather}

by writing $f(t) = (t - \lambda_1)^{m_1}$ and $g(t) = (t - \lambda_2)^{m_2} \cdots (t - \lambda_n)^{m_n}$, and then, using the discussion above, arguing that
\begin{gather}
R(f(T)) = \bigoplus^n_{i = 2} K_{\lambda_i},
\end{gather}

so that $V$ is the direct sum of the generalized eigenspaces. But I am not sure what the inductive step here is.
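For reference, one way the inductive step might be phrased is the following sketch; it assumes the standard facts that $R(f(T))$ is $T$-invariant and that the characteristic polynomial of the restriction is $g(t)$, neither of which is proved here:

```latex
% Sketch of a possible inductive step (not a full proof).
% Induct on the number n of distinct eigenvalues; n = 2 is the base
% case above. Let W = R(f(T)). Since f(T) commutes with T, the
% subspace W is T-invariant, and the characteristic polynomial of the
% restriction T|_W is g(t), which has n - 1 distinct roots. Applying
% the inductive hypothesis to T|_W gives
\begin{gather}
  W = \bigoplus_{i=2}^{n} K_{\lambda_i},
\end{gather}
% where one checks that the generalized eigenspaces of T|_W agree with
% those of T for i >= 2, since each K_{\lambda_i} with i >= 2 is
% contained in W. Combining this with
% V = N(f(T)) \oplus W = K_{\lambda_1} \oplus W completes the step.
```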










  • My feeling here is that you would start with/show $V = K_{\lambda_1} \oplus S$ for some $S$, and then proceed to decompose $S$.
    – AnyAD
    Nov 13 at 22:27















linear-algebra

asked Nov 13 at 22:11
Josabanks
434