Showing that any square matrix in $\mathbb{R}^{n \times n}$ has a “square root”
Prove that there exists $\delta > 0$ so that for all square matrices $A \in \mathbb{R}^{n \times n}$ with $\|A - I\| < \delta$ (where $I$ denotes the identity matrix) there exists $B \in \mathbb{R}^{n \times n}$ so that $B^2 = A$.
My attempt so far:
$$A - I = \begin{bmatrix}
a_{11}-1 & a_{12} & \dots & a_{1n} \\
a_{21} & a_{22}-1 & \dots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{n1} & a_{n2} & \dots & a_{nn}-1
\end{bmatrix}$$
Taking $x = (1, 0, \dots, 0) \in \mathbb{R}^n$, we have that
$$\sqrt{(a_{11}-1)^2 + a_{21}^2 + \cdots + a_{n1}^2} = \|(A-I)x\| \le \sup_{\|y\|=1} \|(A-I)y\| = \|A-I\|_{op} < \delta.$$
Intuitively, applying this to each standard basis vector of $\mathbb{R}^n$ suggests that every column of $A - I$ is small, so $A - I$ can be made close to the zero matrix and hence $A$ close to being a diagonal matrix. But I am having trouble going on from here. Does anyone have any hints?
real-analysis linear-algebra matrices
asked Dec 13 '18 at 3:55 · Joe Man Analysis
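Not part of any proof, but a quick numerical sanity check of the statement: for random matrices with $\|A - I\|_{op}$ below a small $\delta$, a real square root exists and can be computed. The sketch below is purely illustrative; the random-perturbation setup and the use of `scipy.linalg.sqrtm` (which returns a principal matrix square root) are my choices, not anything from the question.

```python
# Quick numerical sanity check (not a proof): draw A = I + E with a small
# operator norm ||E||, compute a matrix square root, and verify it is real
# and squares back to A.
import numpy as np
from scipy.linalg import sqrtm

rng = np.random.default_rng(0)
n, delta = 5, 0.1

for _ in range(100):
    E = rng.standard_normal((n, n))
    E *= (delta * rng.random()) / np.linalg.norm(E, 2)   # so ||E||_op < delta
    A = np.eye(n) + E
    B = sqrtm(A)
    assert np.allclose(B @ B, A, atol=1e-10)              # B really squares to A
    assert np.max(np.abs(np.imag(B))) < 1e-10             # and is (numerically) real
```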
3 Answers
Let
$$I \in U \subset \Bbb R^{n \times n}, \tag 1$$
where $U$ is open. Then we may define the matrix function
$$F(B) = B^2 : U \to \Bbb R^{n \times n}, \tag 2$$
and we note that
$$F(I) = I^2 = I, \tag 3$$
that is, $I$ is in the range of $F(B)$.
$F(B) = B^2$ is in fact differentiable; we have, for $\Delta \in \Bbb R^{n \times n}$,
$$F(B + \Delta) = (B + \Delta)^2 = (B + \Delta)(B + \Delta) = B^2 + B\Delta + \Delta B + \Delta^2, \tag 4$$
$$F(B + \Delta) - F(B) - (B\Delta + \Delta B) = (B + \Delta)^2 - B^2 - (B\Delta + \Delta B) = \Delta^2; \tag 5$$
$$\Vert F(B + \Delta) - F(B) - (B\Delta + \Delta B) \Vert = \Vert (B + \Delta)^2 - B^2 - (B\Delta + \Delta B) \Vert = \Vert \Delta^2 \Vert \le \Vert \Delta \Vert^2; \tag 6$$
since
$$\frac{\Vert \Delta \Vert^2}{\Vert \Delta \Vert} = \Vert \Delta \Vert \to 0 \; \text{as} \; \Delta \to 0, \tag 7$$
we see that $F(B)$ is differentiable and that its derivative is the linear map
$$DF(B)(\Delta) = B\Delta + \Delta B; \tag 8$$
thus
$$DF(I)(\Delta) = I\Delta + \Delta I = 2\Delta, \tag 9$$
which is clearly non-singular. It follows from the inverse function theorem that there is some neighborhood $V$ of $I$ and a function
$$R : V \to U \subset \Bbb R^{n \times n}, \quad R(I) = I, \tag{10}$$
such that for $A \in V$
$$F(R(A)) = (R(A))^2 = A; \tag{11}$$
if we now choose $\delta > 0$ sufficiently small, we may ensure that the set
$$B(I, \delta) = \{ A \in \Bbb R^{n \times n} : \Vert I - A \Vert < \delta \} \subset V, \tag{12}$$
and thus for $A \in B(I, \delta)$ we may set
$$B = R(A), \tag{13}$$
and we have
$$B^2 = (R(A))^2 = F(R(A)) = A, \tag{14}$$
as desired. $OE\Delta$
answered Dec 13 '18 at 17:31, edited Dec 13 '18 at 17:55 · Robert Lewis
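The inverse function theorem gives the local square-root map $R$ abstractly; as a constructive companion (my sketch, not part of the answer above), Newton's method for $F(B) = B^2 - A$ started at $B_0 = I$ stays in the family of matrices commuting with $A$, so each step simplifies to $B_{k+1} = \tfrac12\left(B_k + B_k^{-1}A\right)$, which converges quickly when $A$ is close to $I$.

```python
# Constructive companion to the IFT argument (illustrative sketch):
# Newton's method for F(B) = B^2 - A, started at B_0 = I.  Every iterate is a
# rational function of A, so it commutes with A and the Newton step reduces to
#     B_{k+1} = (B_k + B_k^{-1} A) / 2.
import numpy as np

def sqrt_newton(A, tol=1e-13, max_iter=50):
    """Matrix square root via simplified Newton iteration; intended for A near I."""
    B = np.eye(A.shape[0])
    for _ in range(max_iter):
        B_next = 0.5 * (B + np.linalg.solve(B, A))   # B^{-1} A without forming an inverse
        if np.linalg.norm(B_next - B, 2) < tol:
            return B_next
        B = B_next
    return B

rng = np.random.default_rng(1)
n = 4
E = rng.standard_normal((n, n))
A = np.eye(n) + 0.05 * E / np.linalg.norm(E, 2)      # ||A - I||_op = 0.05
B = sqrt_newton(A)
print(np.linalg.norm(B @ B - A, 2))                  # ~1e-15
```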
Hint:
$$
\sqrt{x} = \sqrt{1 - (1 - x)} = \sum_{k=0}^\infty \binom{1/2}{k} (x-1)^k
$$
by the binomial series. Can you use the above, taking $x = A$, and prove that the series converges in norm to a square root of $A$ under your bound?
answered Dec 13 '18 at 4:04 · qbert
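A minimal numerical sketch of this hint (the code and parameter choices are mine, not the answerer's): since $\|A - I\|_{op} < \delta < 1$, the partial sums $\sum_{k=0}^{N}\binom{1/2}{k}(A-I)^k$ converge in operator norm, and the coefficients follow the recurrence $\binom{1/2}{k+1} = \binom{1/2}{k}\,\frac{1/2 - k}{k+1}$.

```python
# Truncated binomial series  sum_k C(1/2, k) (A - I)^k  for the square root.
# Since ||A - I||_op < 1, both the series and this truncation converge in norm.
import numpy as np

def sqrt_binomial_series(A, terms=60):
    n = A.shape[0]
    E = A - np.eye(n)                  # the small part, ||E||_op < 1
    S = np.zeros((n, n))
    term = np.eye(n)                   # E^k, starting with E^0 = I
    coeff = 1.0                        # C(1/2, k), starting with C(1/2, 0) = 1
    for k in range(terms):
        S += coeff * term
        term = term @ E                # E^{k+1}
        coeff *= (0.5 - k) / (k + 1)   # C(1/2, k+1)
    return S

rng = np.random.default_rng(2)
n = 4
E = rng.standard_normal((n, n))
A = np.eye(n) + 0.1 * E / np.linalg.norm(E, 2)     # ||A - I||_op = 0.1
B = sqrt_binomial_series(A)
print(np.linalg.norm(B @ B - A, 2))                # tiny, far below 1e-12
```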
$x^{T}Ax = x^{T}x + x^{T}(A-I)x \geq (1-\delta)\,x^{T}x \geq 0$. Hence $A$ (being non-negative definite) is diagonalizable with non-negative diagonal entries. Replace the diagonal entries by their square roots to get $B$.
answered Dec 13 '18 at 5:32 · Kavi Rama Murthy
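This argument implicitly assumes $A$ is symmetric, so that non-negative definiteness yields an orthogonal diagonalization $A = Q\,\mathrm{diag}(w)\,Q^{T}$ with $w_i \ge 0$. Under that extra assumption, here is a minimal numerical sketch of the recipe (the code is mine, not the answerer's):

```python
# Sketch of the diagonalization recipe, under the extra (implicit) assumption
# that A is symmetric: A = Q diag(w) Q^T with w >= 0, so B = Q diag(sqrt(w)) Q^T.
import numpy as np

def sqrt_symmetric(A):
    w, Q = np.linalg.eigh(A)                 # eigenvalues w, orthonormal eigenvectors Q
    assert np.all(w >= 0), "requires A positive semidefinite"
    return Q @ np.diag(np.sqrt(w)) @ Q.T

rng = np.random.default_rng(3)
n = 4
E = rng.standard_normal((n, n))
E = 0.5 * (E + E.T)                              # symmetric perturbation
A = np.eye(n) + 0.1 * E / np.linalg.norm(E, 2)   # symmetric, ||A - I||_op = 0.1
B = sqrt_symmetric(A)
print(np.linalg.norm(B @ B - A, 2))              # ~1e-15
```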