Measuring the Difference of Planes

I've been trying to pin down the following intuitive statement:



Let $\pi$ be a $k$-dimensional subspace of $\mathbb{R}^N$ with orthonormal basis $e_1,\ldots,e_k$, and extend it to an orthonormal basis $e_1,\ldots,e_k,f_1,\ldots,f_{N-k}$ of $\mathbb{R}^N$. Let $v_1,\ldots,v_{N-k}$ be vectors in $\pi$.



Now define the tilted plane $\bar\pi$ by $$\bar\pi=\{y+L(y) : y\in\pi\}$$ where $$L(y):=\sum_{j=1}^{N-k}(v_j\cdot y)\,f_j.$$ Thus $\bar\pi$ is $\pi$ after being "tilted" in the directions of the $v_j$.



The estimate I want is the following: $$\|P_\pi-P_{\bar\pi}\|\leqslant C\sum_{j=1}^{N-k}|v_j|^2$$ (or possibly with exponent $1$ in place of $2$) for a purely dimensional constant $C$ not depending on the $v_j$, where $P_\pi$ and $P_{\bar\pi}$ are the orthogonal projections onto the respective planes.
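For concreteness, here is a minimal NumPy sketch of the setup (my own, not part of the original question; it takes $\pi$ to be the span of the first $k$ standard basis vectors, and `proj` is a hypothetical helper):

```python
import numpy as np

rng = np.random.default_rng(0)
N, k = 5, 2
E, F = np.eye(N)[:, :k], np.eye(N)[:, k:]   # columns: ON bases of pi and pi-perp

def proj(B):
    """Orthogonal projection onto the column span of B."""
    return B @ np.linalg.solve(B.T @ B, B.T)

for eps in (1.0, 0.1, 0.01):
    V = np.zeros((N - k, N))
    V[:, :k] = eps * rng.standard_normal((N - k, k))   # rows: the v_j, lying in pi
    Z = E + F @ (V @ E)        # columns: zeta_i = e_i + L(e_i), a basis of bar-pi
    diff = np.linalg.norm(proj(Z) - proj(E), 2)        # operator norm
    s1 = sum(np.linalg.norm(v) for v in V)
    s2 = sum(np.linalg.norm(v) ** 2 for v in V)
    print(f"eps={eps}: diff={diff:.4f}, sum|v|={s1:.4f}, sum|v|^2={s2:.4f}")
```

In runs like this the projection difference shrinks linearly with the tilt, tracking $\sum|v_j|$ rather than $\sum|v_j|^2$, which suggests the exponent-$1$ version is the right target.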



As for progress I've made: the basis $\zeta_1,\ldots,\zeta_k$ of $\bar\pi$ defined by $$\zeta_i=e_i+L(e_i)=e_i+\sum_{j=1}^{N-k}(v_j\cdot e_i)\,f_j$$ should yield the metric $$g=I+L^tL,$$ where $L$ is the matrix whose $(i,j)$ entry is $v_i\cdot e_j$.



This yields an estimate of the desired form $$\|g-I\|\leqslant C\sum_{j=1}^{N-k}|v_j|^2.$$
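In fact this step can be made quantitative with $C=1$; filling in the computation (my own, with $L$ the $(N-k)\times k$ matrix above, so $g-I=L^tL$, $\|\cdot\|$ the operator norm, and $\|\cdot\|_F$ the Frobenius norm):

$$\|g-I\|=\|L^tL\|=\|L\|^2\leqslant\|L\|_F^2=\sum_{i=1}^{N-k}\sum_{j=1}^{k}(v_i\cdot e_j)^2=\sum_{i=1}^{N-k}|v_i|^2,$$

using that each $v_i$ lies in $\pi$ and the $e_j$ form an orthonormal basis of $\pi$.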



If the RHS were small enough, we could apply the Neumann series theorem to invert $g$ and get estimates, but I can't do this without choosing a new norm depending on the $v_j$, thus introducing non-dimensional dependencies into the constant $C$.



At any rate, just subtracting the projections applied to a vector $x$ of norm $1$ yields something like $$P_{\bar\pi}x-P_\pi x=(g^{ij}-\delta^{ij})(x\cdot e_i)e_j+g^{ij}(x\cdot L(e_i))e_j+g^{ij}(x\cdot e_i)L(e_j)+g^{ij}(x\cdot L(e_i))L(e_j).$$ So any estimate has to come from the $g^{ij}$, and I am at a loss for how to obtain one. Of course, an estimate on the inverse metric would yield an estimate on $g^{ij}-\delta^{ij}=g^{ik}(\delta_{kj}-g_{kj})$, taking care of the first term as well. A potential issue I've come across in my scratch work is that it's very easy to introduce higher powers of the $|v_j|$, which, as far as I can tell, I can't control.
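For the record, the display above presumably comes from the standard formula for orthogonal projection onto the span of a (not necessarily orthonormal) basis, in the Einstein summation convention already in use here:

$$P_{\bar\pi}x=g^{ij}(x\cdot\zeta_i)\,\zeta_j,$$

expanded via $\zeta_i=e_i+L(e_i)$ and with $P_\pi x=\delta^{ij}(x\cdot e_i)e_j$ subtracted off.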



Perhaps a related question is what can be said about estimating the norm of the matrix $(I+A)^{-1}$ where $A$ is symmetric. I've seen some posts around that give rather convoluted expressions involving $A^{-1}$, which I would rather avoid if possible, since then I need estimates on the entries of the inverse.
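A remark of my own on this side question: when $A$ is symmetric positive semidefinite, as $A=L^tL$ is here, no smallness assumption or Neumann series is needed. Diagonalizing $A$ with eigenvalues $\lambda_i\geqslant 0$,

$$\|(I+A)^{-1}\|=\max_i\frac{1}{1+\lambda_i}\leqslant 1,\qquad\|(I+A)^{-1}-I\|=\max_i\frac{\lambda_i}{1+\lambda_i}\leqslant\|A\|,$$

so the inverse metric satisfies $\|g^{-1}\|\leqslant 1$ and $\|g^{-1}-I\|\leqslant\|g-I\|$ unconditionally, with constants independent of the $v_j$.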



Throughout all of this, I am quite shocked that this hasn't been more immediate: the statement is so intuitive, after all!



Thanks so much!

linear-algebra analysis differential-geometry riemannian-geometry

asked Dec 26 '18 at 19:14, edited Dec 27 '18 at 6:04
Hunter Stufflebeam

  • deleted a previous comment of mine because it was total hogwash – Hunter Stufflebeam, Dec 27 '18 at 6:48
1 Answer

A friend came up with the following simplified approach:



Let $T:=P_{\bar\pi}-P_\pi$, and let $C=\sum_{j=1}^{N-k}|v_j|$. We aim to show that $$|Tx|\leqslant\sqrt{2}\,C|x|$$ for all $x\in\mathbb{R}^N$.



It suffices to show the bound $|Tx|\leqslant C|x|$ separately when $x\in\pi$ and when $x\in\pi^\perp$, since any $z\in\mathbb{R}^N$ has an orthogonal decomposition with components in $\pi$ and $\pi^\perp$, allowing us to estimate $$|Tx|=|Tx^{\mathrm{T}}+Tx^{\perp}|\leqslant|Tx^{\mathrm{T}}|+|Tx^{\perp}|\leqslant C|x^{\mathrm{T}}|+C|x^{\perp}|\leqslant\sqrt{2}\,C|x|,$$ where the last inequality follows from Jensen.
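Spelled out, that last step is just the elementary inequality $a+b\leqslant\sqrt{2}\,(a^2+b^2)^{1/2}$:

$$C|x^{\mathrm{T}}|+C|x^{\perp}|\leqslant\sqrt{2}\,C\left(|x^{\mathrm{T}}|^2+|x^{\perp}|^2\right)^{1/2}=\sqrt{2}\,C|x|,$$

using the orthogonality relation $|x|^2=|x^{\mathrm{T}}|^2+|x^{\perp}|^2$.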



Thus, in case $x\in\pi$ and $|x|=1$, we have $$|P_\pi x-P_{\bar\pi}x|=|x-P_{\bar\pi}x|\leqslant|x-y|$$ for all $y\in\bar\pi$. In particular the inequality holds for $y=x+L(x)$, and the estimate follows.
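Explicitly, taking $y=x+L(x)$ makes the right-hand side $|L(x)|$, and since the $f_j$ are orthonormal and $|x|=1$,

$$|L(x)|=\Big(\sum_{j=1}^{N-k}(v_j\cdot x)^2\Big)^{1/2}\leqslant\Big(\sum_{j=1}^{N-k}|v_j|^2\Big)^{1/2}\leqslant\sum_{j=1}^{N-k}|v_j|=C.$$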



In case $x\in\pi^\perp$ and $|x|=1$, we have $$|P_{\bar\pi}x-P_\pi x|^2=|P_{\bar\pi}x|^2=\langle P_{\bar\pi}x,P_{\bar\pi}x\rangle=\frac{\langle x,P_{\bar\pi}x\rangle^2}{|P_{\bar\pi}x|^2}=\frac{\langle x,y+L(y)\rangle^2}{|y+L(y)|^2}$$ for some $y\in\pi$, which yields
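(To pass to the next display: $x\perp\pi$ gives $\langle x,y+L(y)\rangle=\langle x,L(y)\rangle$, Cauchy–Schwarz with $|x|=1$ gives $\langle x,L(y)\rangle^2\leqslant|L(y)|^2$, and $L(y)\perp y$ gives $|y+L(y)|^2=|y|^2+|L(y)|^2$.)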



$$|P_{\bar\pi}x-P_\pi x|\leqslant\left(\frac{|L(y)|^2}{|y|^2+|L(y)|^2}\right)^{1/2}=\left(\frac{|L(\hat y)|^2}{1+|L(\hat y)|^2}\right)^{1/2}\leqslant C,$$ where we have written $\hat y=y/|y|$.



And that's it!
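As a quick numerical sanity check of the $\sqrt{2}\,C$ bound (again a NumPy sketch of my own, with the same conventions as the snippet in the question):

```python
import numpy as np

rng = np.random.default_rng(1)
N, k = 6, 3
E, F = np.eye(N)[:, :k], np.eye(N)[:, k:]   # ON bases of pi and pi-perp

for trial in range(1000):
    V = np.zeros((N - k, N))
    V[:, :k] = rng.uniform(0.01, 10) * rng.standard_normal((N - k, k))  # v_j in pi
    Z = E + F @ (V @ E)                       # basis zeta_i of the tilted plane
    P_pi = E @ E.T
    P_bar = Z @ np.linalg.solve(Z.T @ Z, Z.T)
    diff = np.linalg.norm(P_bar - P_pi, 2)
    C = sum(np.linalg.norm(v) for v in V)
    assert diff <= np.sqrt(2) * C + 1e-9      # |T| <= sqrt(2) C from this answer
print("sqrt(2)*C bound held in all trials")
```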






answered Dec 27 '18 at 18:25, edited Dec 27 '18 at 22:38
Hunter Stufflebeam

  • eh, this version is fine; tried simplifying some stuff, but why fix it if it ain't broke – Hunter Stufflebeam, Dec 27 '18 at 22:39