Decomposing a positive semi-definite matrix with all -1,+1 elements
Claim. A matrix $X \in \{-1,1\}^{n\times n}$ is positive semi-definite if and only if it is of the form $X = xx^{T}$ for some $x \in \{-1,1\}^n$.



How can I prove this? Proving the 'if' part is easy, since $y^Txx^Ty = (x^Ty)^T(x^Ty) \geq 0$ for any $y \in \mathbb{R}^n$. But the converse is not as straightforward.



Notice that $X$ has an all-ones diagonal, since the diagonal of a psd matrix cannot have negative elements.
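Before attempting a proof, the claim can be sanity-checked by brute force for a small $n$. The sketch below (assuming NumPy is available) enumerates all symmetric $\pm1$ matrices with an all-ones diagonal for $n=3$ and verifies that the psd ones are exactly the rank-one matrices $xx^T$:

```python
import itertools
import numpy as np

def is_psd(A, tol=1e-9):
    """A symmetric matrix is psd iff its smallest eigenvalue is nonnegative."""
    return np.linalg.eigvalsh(A).min() >= -tol

n = 3
# All rank-one sign matrices x x^T for x in {-1,1}^n.
rank_one = [np.outer(x, x) for x in itertools.product([-1, 1], repeat=n)]

psd_count = 0
for offdiag in itertools.product([-1, 1], repeat=n * (n - 1) // 2):
    A = np.ones((n, n))          # all-ones diagonal (forced for a psd sign matrix)
    iu = np.triu_indices(n, k=1)
    A[iu] = offdiag              # fill the upper triangle...
    A.T[iu] = offdiag            # ...and mirror it for symmetry
    if is_psd(A):
        psd_count += 1
        # The claim: every psd sign matrix equals some x x^T.
        assert any(np.array_equal(A, R) for R in rank_one)

print(psd_count)  # 4, i.e. 2^(n-1) psd sign matrices for n = 3
```

The count $2^{n-1}$ matches the pigeonhole remark in the accepted answer below.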
      linear-algebra matrices matrix-calculus symmetric-matrices positive-semidefinite
      edited Dec 3 at 19:28
      Yiorgos S. Smyrlis
      asked Nov 22 at 18:45
      independentvariable
          1 Answer
Let $A$ be our matrix. The corresponding quadratic form is
$$
p(x_1,\ldots,x_n)=\boldsymbol x^{t}A\boldsymbol x=x_1^2+\cdots+x_n^2+2\sum_{1\le i<j\le n}\varepsilon_{ij}x_ix_j, \tag{1}
$$

where $\varepsilon_{ij}=\pm 1$ and $\boldsymbol x=(x_1,\ldots,x_n)$. We shall show that if $p$ is positive semi-definite (psd), then
$$
p(x_1,\ldots,x_n)=(\varepsilon_1 x_1+\cdots+\varepsilon_n x_n)^2, \tag{2}
$$

for suitable $\varepsilon_j=\pm 1$, $j=1,\ldots,n$, in which case
$A=(\varepsilon_1,\ldots,\varepsilon_n)(\varepsilon_1,\ldots,\varepsilon_n)^{t}$.



If the expression $(1)$ is psd, then
$$
\varepsilon_{ij}\varepsilon_{jk}=\varepsilon_{ik},\quad\text{for all } i\ne j\ne k\ne i.
\tag{3}
$$

If $(3)$ fails, and for some $i,j,k$ we have
$\varepsilon_{ij}\varepsilon_{jk}=-\varepsilon_{ik}$, then let $\boldsymbol x\in \mathbb{R}^n$ be the vector having $\varepsilon_{jk},\varepsilon_{ik},\varepsilon_{ij}$ in the positions $i,j,k$, respectively, and zeros everywhere else. Each cross term of $p$ then equals the triple product $\varepsilon_{ij}\varepsilon_{jk}\varepsilon_{ik}=-\varepsilon_{ik}^2=-1$, so
$$
\boldsymbol x^{t}A\boldsymbol x=\varepsilon_{jk}^2
+\varepsilon_{ik}^2+\varepsilon_{ij}^2+2
\bigl(\varepsilon_{ij}\varepsilon_{jk}\varepsilon_{ik}+\varepsilon_{ik}\varepsilon_{jk}\varepsilon_{ij}
+\varepsilon_{jk}\varepsilon_{ik}\varepsilon_{ij}\bigr)=3-6=-3<0.
$$
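As a concrete numerical illustration (a sketch assuming NumPy), take the $n=3$ instance with $\varepsilon_{12}=\varepsilon_{13}=\varepsilon_{23}=-1$, for which $(3)$ fails:

```python
import numpy as np

# eps_12 = eps_13 = eps_23 = -1, so eps_12 * eps_23 = +1 != eps_13 = -1
# and condition (3) fails.
A = np.array([[ 1, -1, -1],
              [-1,  1, -1],
              [-1, -1,  1]])

# Witness vector from the argument above: eps_jk, eps_ik, eps_ij
# in positions i, j, k, i.e. (-1, -1, -1).
x = np.array([-1, -1, -1])

print(x @ A @ x)                    # 3 - 6 = -3 < 0, so A is not psd
print(np.linalg.eigvalsh(A).min())  # smallest eigenvalue is -1
```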



But if $(3)$ holds, then
$$
x_1^2+\cdots+x_n^2+2\sum_{1\le i<j\le n}\varepsilon_{ij}x_ix_j=\bigl(x_1+\varepsilon_{12}x_2+\varepsilon_{12}\varepsilon_{23}x_3+\cdots+\varepsilon_{12}\cdots\varepsilon_{n-1,n}x_n\bigr)^2,
$$
since, for $i<j$, repeated application of $(3)$ collapses the coefficient of $x_ix_j$ on the right-hand side, $\varepsilon_{12}\cdots\varepsilon_{i-1,i}\cdot\varepsilon_{12}\cdots\varepsilon_{j-1,j}=\varepsilon_{i,i+1}\cdots\varepsilon_{j-1,j}$, to $\varepsilon_{ij}$.
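This construction also suggests how to read the decomposition off a psd sign matrix directly: normalizing $\varepsilon_1=+1$, the first row of $A$ is $(\varepsilon_1,\ldots,\varepsilon_n)$, i.e. a valid $\boldsymbol x$. A minimal sketch (assuming NumPy; the function name `decompose_sign_psd` is ours, not from the post):

```python
import numpy as np

def decompose_sign_psd(A):
    """Given a psd matrix A with entries in {-1, 1}, return x with A == x x^T.

    With the normalization eps_1 = +1, the first row of A is
    (eps_1, ..., eps_n), which is exactly the vector of the decomposition."""
    x = A[0].copy()
    if not np.array_equal(A, np.outer(x, x)):
        raise ValueError("A is not a psd sign matrix")
    return x

x = np.array([1, -1, 1, -1])
A = np.outer(x, x)
print(decompose_sign_psd(A))   # [ 1 -1  1 -1]
```

Note that $\boldsymbol x$ is only determined up to a global sign: if $A=\boldsymbol x\boldsymbol x^{t}$ then also $A=(-\boldsymbol x)(-\boldsymbol x)^{t}$.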



Note. Another proof is based on the Pigeonhole Principle. There exist exactly $2^{n-1}$ symmetric matrices in $\{-1,1\}^{n\times n}$ which satisfy $(3)$ and have $\varepsilon_{ii}=1$ for all $i$: the entries $\varepsilon_{12},\ldots,\varepsilon_{1n}$ may be chosen freely, and $(3)$ then determines the rest. Also, there exist exactly $2^{n-1}$ matrices of the form $\boldsymbol x\,\boldsymbol x^{t}$ with $\boldsymbol x\in\{-1,1\}^n$, since $\boldsymbol x$ and $-\boldsymbol x$ yield the same matrix.
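The two counts in the note can be checked by enumeration for a small case (a sketch assuming NumPy, here with $n=4$):

```python
import itertools
import numpy as np

n = 4

# Distinct matrices x x^T for x in {-1,1}^n: x and -x give the same matrix.
outer_products = {np.outer(x, x).tobytes()
                  for x in itertools.product([-1, 1], repeat=n)}

def satisfies_3(A):
    """Check condition (3): eps_ij * eps_jk == eps_ik for distinct i, j, k."""
    return all(A[i, j] * A[j, k] == A[i, k]
               for i, j, k in itertools.permutations(range(n), 3))

count = 0
for offdiag in itertools.product([-1, 1], repeat=n * (n - 1) // 2):
    A = np.ones((n, n), dtype=int)
    iu = np.triu_indices(n, k=1)
    A[iu] = offdiag
    A.T[iu] = offdiag
    if satisfies_3(A):
        count += 1
        assert A.tobytes() in outer_products  # each such matrix is some x x^T

print(count, len(outer_products))  # both are 8 == 2^(n-1)
```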
• How do you arrive at (3)? Why does a product of $\epsilon$'s appear? – daw, Nov 22 at 20:36

• @daw To show that (3) is necessary for psd-ness, assume it does not hold, e.g. $\varepsilon_{12}=\varepsilon_{23}=\varepsilon_{13}=-1$; with $\boldsymbol x=(1,1,1)$ one gets $p(1,1,1)=-3$. – Yiorgos S. Smyrlis, Nov 22 at 21:11

• Can you please provide more detail? This is not very clear. The last part is not obvious. – independentvariable, Nov 23 at 13:27

• @independentvariable See my modified answer. – Yiorgos S. Smyrlis, Nov 23 at 18:55

• PS: I believe (3) should hold for $i \neq j \neq k \neq i$. Am I correct? – independentvariable, Nov 23 at 18:56
          edited Nov 25 at 15:32
          answered Nov 22 at 20:00
          Yiorgos S. Smyrlis