Covariance proof















Let the covariance of random variables $X$ and $Y$ be:
$$Cov[X, Y] = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y]$$



Prove the following for random variables $X_1, \dots, X_n$:
$$Var\Big[\sum_{i = 1}^n X_i\Big] = \sum_{i = 1}^n Var[X_i] + \sum_{i = 1}^n \sum_{j = 1,\, j \neq i}^n Cov[X_i, X_j]$$




How would I go about proving this?
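Before proving it, the identity can be sanity-checked numerically. Below is a rough NumPy sketch with arbitrary, deliberately dependent variables (the data and names are illustrative only; this checks the formula on sample statistics, it is not a proof):

    import numpy as np

    rng = np.random.default_rng(0)
    N = 100_000

    # Three deliberately dependent variables X_1, X_2, X_3 (illustrative choice).
    z = rng.normal(size=N)
    X = np.stack([z + rng.normal(size=N),
                  2.0 * z + rng.normal(size=N),
                  rng.normal(size=N)])
    n = X.shape[0]

    # Sample covariance matrix: C[i, j] = Cov(X_i, X_j); the diagonal holds Var(X_i).
    C = np.cov(X)                      # same (N-1) normalisation as ddof=1 below

    # Left-hand side: Var of the sum of the X_i.
    lhs = np.var(X.sum(axis=0), ddof=1)

    # Right-hand side of the claimed identity: variances plus all i != j covariances.
    rhs = (sum(C[i, i] for i in range(n))
           + sum(C[i, j] for i in range(n) for j in range(n) if j != i))

    print(lhs, rhs)                    # the two numbers agree up to floating-point rounding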










probability probability-theory discrete-mathematics






asked Dec 13 '18 at 3:58









J. Lastin









  • This is akin to expanding a squared sum. Can you try it for two random variables and use induction? That's my strat. – Sean Roberson, Dec 13 '18 at 4:00

  • @SeanRoberson Induction can indeed be used, but I would not recommend it for this case. – drhab, Dec 13 '18 at 14:15














2 Answers

We will use the principle of mathematical induction.

Basis

$Var(X_1+X_2)=E((X_1+X_2)^2)-E^2(X_1+X_2)$

$=E(X_1^2+2X_1X_2+X_2^2)-(E(X_1)+E(X_2))^2$

$=E(X_1^2)-E^2(X_1)+E(X_2^2)-E^2(X_2)+2(E(X_1X_2)-E(X_1)E(X_2))$

$=Var(X_1)+Var(X_2)+2Cov(X_1,X_2)$

Induction Hypothesis

$Var\big(\sum_{i=1}^k X_i\big)=\sum_{i=1}^k Var(X_i)+\sum_{i=1}^k \sum_{j=1,\,j\neq i}^k Cov(X_i,X_j)$

Inductive step

$Var\big(\sum_{i=1}^{k+1} X_i\big)=Var\big(X_{k+1}+\sum_{i=1}^k X_i\big)$

$=Var(X_{k+1})+Var\big(\sum_{i=1}^k X_i\big)+2\,Cov\big(X_{k+1},\sum_{i=1}^k X_i\big)$ (by the basis, applied to $X_{k+1}$ and $\sum_{i=1}^k X_i$)

$=Var(X_{k+1})+Var\big(\sum_{i=1}^k X_i\big)+2\sum_{i=1}^k Cov(X_{k+1},X_i)$
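The last equality can be spelled out using only the definition of covariance given in the question and linearity of expectation (added here for completeness):

$$Cov\Big(X_{k+1},\sum_{i=1}^{k}X_i\Big)=E\Big[X_{k+1}\sum_{i=1}^{k}X_i\Big]-E[X_{k+1}]\,E\Big[\sum_{i=1}^{k}X_i\Big]=\sum_{i=1}^{k}\big(E[X_{k+1}X_i]-E[X_{k+1}]E[X_i]\big)=\sum_{i=1}^{k}Cov(X_{k+1},X_i)$$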



which, after applying the induction hypothesis and using the symmetry $Cov(X_{k+1},X_i)=Cov(X_i,X_{k+1})$ to split the factor of $2$, gives the required result for $k+1$.

Hope it is helpful :)






– Martund, answered Dec 13 '18 at 4:38 (edited Dec 14 '18 at 0:16)











Let $Z_i = X_i - \mathbb{E}X_i$ for $i=1,\dots,n$.

Then:

$$\mathsf{Var}\left(\sum_{i=1}^n X_i\right)=\mathbb{E}\left(\sum_{i=1}^n X_i-\mathbb{E}\sum_{i=1}^n X_i\right)^2=\mathbb{E}\left(\sum_{i=1}^n Z_i\right)^2=\mathbb{E}\sum_{i=1}^n\sum_{j=1}^n Z_iZ_j=$$$$\sum_{i=1}^n\mathbb{E}Z_i^2+\sum_{i=1}^n\sum_{j=1,\,j\neq i}^n\mathbb{E}Z_iZ_j=\sum_{i=1}^n\mathsf{Var}\,X_i+\sum_{i=1}^n\sum_{j=1,\,j\neq i}^n\mathsf{Cov}(X_i,X_j)$$
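A side remark, not needed for the proof: the double sum $\sum_{i}\sum_{j}\mathbb{E}Z_iZ_j$ is just the sum of every entry of the covariance matrix, which makes the identity easy to check numerically. A rough NumPy sketch (the data are arbitrary and illustrative only):

    import numpy as np

    rng = np.random.default_rng(1)
    N = 50_000

    # Any dependent variables will do; X_2 is built from X_1 so the covariances are nonzero.
    x1 = rng.normal(size=N)
    X = np.stack([x1, 0.5 * x1 + rng.normal(size=N), rng.exponential(size=N)])

    C = np.cov(X)          # C[i, j] = Cov(X_i, X_j); diagonal entries are Var(X_i)

    # Var(sum_i X_i) equals the sum over all (i, j) of Cov(X_i, X_j),
    # i.e. the sum of every entry of the covariance matrix.
    print(np.var(X.sum(axis=0), ddof=1), C.sum())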






– drhab, answered Dec 13 '18 at 14:09








