Explain why $E(X) = \int_0^\infty (1-F_X(t)) \, dt$ for every nonnegative random variable $X$



























Let $X$ be a non-negative random variable and $F_{X}$ the corresponding CDF. Show that
$$E(X) = \int_0^\infty (1-F_X(t)) \, dt$$
when $X$ has (a) a discrete distribution, (b) a continuous distribution.




I assumed that for the continuous case, since $F_X(t) = \mathbb{P}(X\leq t)$, we have $1-F_X(t) = 1- \mathbb{P}(X\leq t) = \mathbb{P}(X> t)$, though I have no idea how useful integrating that is.





























  • In the two cases, it's a rewriting of the sum. Start from the RHS, which you can express in the first case as an integral of a sum and in the second as a double integral, then switch them. This is allowed because all the quantities are non-negative. – Davide Giraudo, Jul 19 '12 at 13:42

  • This question was asked here previously. Check and you will find a more detailed answer, either here or on CV. – Michael Chernick, Jul 19 '12 at 14:21

  • See, for example, the answers to this question, which include both formal proofs (by Didier, who has answered your question here) as well as more intuitive approaches to the problem. – Dilip Sarwate, Jul 19 '12 at 15:38

  • As far as usefulness, this can be more numerically stable than differentiating $F$, multiplying by $t$, and integrating. Actually, most random variables don't have pdfs, so differentiating $F$ may not even be possible. – cantorhead, Oct 26 '15 at 21:12
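As a quick numerical illustration of the identity $E(X) = \int_0^\infty (1-F_X(t))\,dt$ (and of computing $E(X)$ from the survival function rather than a density), here is a sketch assuming an Exponential(1) distribution; the distribution choice is purely illustrative:

```python
import math

# Sanity check of E(X) = ∫_0^∞ (1 - F_X(t)) dt, assuming an Exponential(1)
# distribution (illustrative choice): F(t) = 1 - exp(-t), so the
# survival function is 1 - F(t) = exp(-t), and the exact mean is E(X) = 1.

def survival(t):
    """S(t) = 1 - F(t) = exp(-t) for Exponential(1)."""
    return math.exp(-t)

def midpoint(f, a, b, n=100_000):
    """Midpoint-rule quadrature of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

# Truncate the infinite upper limit at 40; exp(-40) is negligible.
approx = midpoint(survival, 0.0, 40.0)
print(approx)  # ≈ 1.0, the exact mean
```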
probability probability-theory expected-value faq


edited Nov 13 '18 at 11:55 by Lee David Chung Lin

asked Jul 19 '12 at 13:37 by mercurial

3 Answers

For every nonnegative random variable $X$, whether discrete or continuous or a mix of these,
$$
X=\int_0^X\mathrm dt=\int_0^{+\infty}\mathbf 1_{X\gt t}\,\mathrm dt=\int_0^{+\infty}\mathbf 1_{X\geqslant t}\,\mathrm dt,
$$
hence
$$
\mathrm E(X)=\int_0^{+\infty}\mathrm P(X\gt t)\,\mathrm dt=\int_0^{+\infty}\mathrm P(X\geqslant t)\,\mathrm dt.
$$

Likewise, for every $p>0$,
$$
X^p=\int_0^X p\,t^{p-1}\,\mathrm dt=\int_0^{+\infty}\mathbf 1_{X\gt t}\,p\,t^{p-1}\,\mathrm dt=\int_0^{+\infty}\mathbf 1_{X\geqslant t}\,p\,t^{p-1}\,\mathrm dt,
$$
hence
$$
\mathrm E(X^p)=\int_0^{+\infty}p\,t^{p-1}\,\mathrm P(X\gt t)\,\mathrm dt=\int_0^{+\infty}p\,t^{p-1}\,\mathrm P(X\geqslant t)\,\mathrm dt.
$$
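A sketch of the step between the two displays, which the answer leaves implicit: take expectations of both sides and exchange $\mathrm E$ with the integral, which Tonelli's theorem permits because the integrand is nonnegative:

```latex
\mathrm E(X)
  = \mathrm E\left(\int_0^{+\infty}\mathbf 1_{X\gt t}\,\mathrm dt\right)
  = \int_0^{+\infty}\mathrm E\left(\mathbf 1_{X\gt t}\right)\mathrm dt
  = \int_0^{+\infty}\mathrm P(X\gt t)\,\mathrm dt.
```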







answered Jul 19 '12 at 14:28 by Did (edited May 21 '17 at 13:52)













  • May I ask how you derive the first equation? The left side is a function from the sample space (possibly to $\mathbb{R}$) and the right side is an integral and therefore a number. Am I right? – Cupitor, Jun 2 '14 at 15:26

  • @Cupitor The left-hand side, the middle, and the right-hand side are all random variables; for example, the value at $\omega$ of the right-hand side is $$\int_0^{+\infty}\mathbf 1_{X(\omega)\geqslant t}\,\mathrm dt.$$ – Did, Jun 2 '14 at 19:35

  • $U=\mathbf 1_{X\geqslant t}$ is the function defined on $\Omega$ by $U(\omega)=1$ if $X(\omega)\geqslant t$ and $U(\omega)=0$ otherwise. – Did, Jun 3 '14 at 10:09

  • The second step is to consider the expectation of each side (that is, its integral with respect to $P$). – Did, Jun 3 '14 at 15:14

  • @see Yes, your reading of these formulas and the proof in your first comment are both correct. – Did, Feb 12 '17 at 21:35





















Copied from Cross Validated / stats.stackexchange:

[image]

where $S(t)$ is the survival function equal to $1- F(t)$. The two areas are clearly identical.
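The same two-areas picture works in the discrete case, where the integral of $S$ collapses to a sum of tail probabilities. A sketch with a Geometric($p$) distribution, an illustrative choice not taken from the answer:

```python
# For a Geometric(p) variable on {1, 2, ...}, P(X > n) = (1 - p)**n and
# E(X) = 1/p. Since S(t) = P(X > t) is constant on each interval [n, n+1),
# the area under S equals the sum of the tail probabilities P(X > n).
p = 0.3
expected = 1 / p

tail_sum = sum((1 - p) ** n for n in range(10_000))
print(tail_sum)  # ≈ 3.333..., matching E(X) = 1/p
```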






answered Jul 19 '12 at 17:53 by Henry (edited Apr 13 '17 at 12:44)





















Another way: we know $X=F^{-1}(U)$ in distribution, where $F$ is the CDF of $X$ and $U$ is uniform on $(0,1)$. So the expected value is $$\int_{0}^{1} F^{-1}(u) \, du.$$ If we look at this region, we notice that it is equivalent to the area above the CDF bounded by 1. So for nonnegative $X$ we get $$\int_{0}^{1} F^{-1}(u) \, du = \int_{0}^{\infty} (1-F(x)) \, dx = \int_{0}^{\infty} P(X \geq x) \, dx.$$
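A numerical sketch of this quantile-based identity, assuming an Exponential(1) distribution so that $F^{-1}(u)=-\ln(1-u)$; the choice of distribution is ours, not the answer's:

```python
import math

# Compare ∫_0^1 F^{-1}(u) du with ∫_0^∞ (1 - F(x)) dx, assuming an
# Exponential(1) distribution: F^{-1}(u) = -ln(1 - u) and
# 1 - F(x) = exp(-x). Both integrals equal E(X) = 1.

def midpoint(f, a, b, n=100_000):
    """Midpoint-rule quadrature of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

quantile_side = midpoint(lambda u: -math.log(1.0 - u), 0.0, 1.0)
survival_side = midpoint(lambda x: math.exp(-x), 0.0, 40.0)
print(quantile_side, survival_side)  # both ≈ 1.0
```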






answered Dec 11 '18 at 23:33 by CodingWolf (edited Dec 11 '18 at 23:51 by Xander Henderson)