Proving That a Version of the Law of Total Probability Follows from Adam's Law











I have a homework question that asks:



Show that the following version of LOTP follows from Adam's law: for any event $A$ and continuous random variable $X$ with PDF $f_X$:



$$ P(A) = \int_{-\infty}^{\infty} P(A|X=x)\,f_X(x)\,dx $$



[Edited to add: Adam's Law is also known as the Law of Total Expectation and the Law of Iterated Expectation; my text states it as $E(Y) = E(E(Y|X))$.]



Here is the proof I have written:



$P(A) = E(I_A)$ and $P(A|X = x) = E(I_A|X = x)$ by the fundamental bridge.



Let $E(I_A|X = x) = g(x)$, a function of $x$. Then:



$E(g(X)) = \int_{-\infty}^{\infty} g(x)f_X(x)\,dx$ (this is a formula I found in my text)



$= E(E(I_A|X)) = E(I_A)$ (by Adam's law)



$= P(A)$ (by the bridge)



Therefore,



$P(A) = \int_{-\infty}^{\infty} E(I_A|X)\,f_X(x)\,dx = \int_{-\infty}^{\infty} P(A|X)\,f_X(x)\,dx$



My main concern is that in this proof I have dropped the expected value of the indicator variable into the integral, yet I believe indicator variables are always discrete. I'm not sure how to resolve this: I am asked to connect the given (continuous) formula to Adam's law, which requires expectation, and connecting probability to expectation requires indicator variables.
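As a concrete sanity check of the identity (a toy example of my own, not from the text): take $X \sim \mathrm{Uniform}(0,1)$, so $f_X(x) = 1$ on $[0,1]$, and let $A$ occur with probability $x$ given $X = x$. The right-hand side is then $\int_0^1 x\,dx = 1/2$, which simulation agrees with:

```python
import random

# Sanity check of  P(A) = ∫ P(A|X=x) f_X(x) dx  on a toy example:
# X ~ Uniform(0,1) (so f_X(x) = 1 on [0,1]) and, given X = x, the
# event A occurs with probability x.  The integral is ∫_0^1 x dx = 1/2.

random.seed(0)
n = 200_000
hits = 0
for _ in range(n):
    x = random.random()          # draw X
    if random.random() < x:      # given X = x, A occurs with probability x
        hits += 1
p_mc = hits / n                  # Monte Carlo estimate of P(A)

# Midpoint Riemann sum for the right-hand side  ∫_0^1 x · 1 dx.
m = 10_000
p_int = sum((i + 0.5) / m * (1 / m) for i in range(m))

print(p_mc, p_int)  # both should be close to 0.5
```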










  • Your argument is quite fine; there is no problem with the indicator (characteristic) functions. You might want to have a look at en.wikipedia.org/wiki/Conditional_expectation for some generalizations of what you proved.
    – John B
    Nov 20 at 21:59

















probability conditional-expectation conditional-probability






asked Nov 20 at 21:41 by JStorm












1 Answer






Therefore,



$\mathsf P(A) = \int_{-\infty}^{\infty} \mathsf E(I_A|X)\,f_X(x)\,dx = \int_{-\infty}^{\infty} \mathsf P(A|X)\,f_X(x)\,dx$



My main concern is that in this proof, I have dropped the expected value of the indicator variable into the integral, but I believe indicator variables are always discrete.




It is not a concern.



You are not using the indicator random variable itself, but its conditional expectation, $\mathsf E(\mathrm I_A\mid X)$, and inside the integral you actually want to use the function of $x$, namely $\mathsf E(\mathrm I_A\mid X{=}x)$.



$$\begin{align}\mathsf P(A) &= \mathsf E(\mathrm I_A)\\ &= \mathsf E(\mathsf E(\mathrm I_A\mid X))\\ &= \int_{-\infty}^{\infty} \mathsf E(\mathrm I_A\mid X{=}x)\,f_X(x)\,\mathsf dx &&\text{since }\mathsf E(g(X))=\int_{\Bbb R} g(x)\,f_X(x)\,\mathsf dx \\ &= \int_{-\infty}^{\infty} \mathsf P(A\mid X{=}x)\,f_X(x)\,\mathsf dx\end{align}$$



In short, the Law of Total Probability is: $\mathsf P(A)=\mathsf E(\mathsf P(A\mid X))$.
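This compact form is easy to check numerically on a hypothetical example of my own (not from the answer): take $X, Y$ i.i.d. standard normal and $A = \{X + Y > 0\}$. Conditional on $X = x$, $A$ occurs iff $Y > -x$, so $\mathsf P(A\mid X{=}x) = \Phi(x)$, the standard normal CDF, and by symmetry $\mathsf P(A) = 1/2$:

```python
import math
import random

# Monte Carlo check of  P(A) = E(P(A|X))  for X, Y iid N(0,1)
# and A = {X + Y > 0}.  Conditional on X = x, A occurs iff Y > -x,
# so P(A|X=x) = Phi(x); by symmetry P(A) = 1/2.

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

random.seed(1)
n = 100_000

# Left-hand side: estimate P(A) directly.
direct = sum(random.gauss(0, 1) + random.gauss(0, 1) > 0 for _ in range(n)) / n

# Right-hand side: estimate E(P(A|X)) = E(Phi(X)).
via_lotp = sum(phi(random.gauss(0, 1)) for _ in range(n)) / n

print(direct, via_lotp)  # both estimates should hover near 0.5
```

Both estimators target the same quantity; the second (conditioning on $X$ and averaging $\Phi(X)$) is a Rao-Blackwellized version of the first and typically has lower variance.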






answered Nov 20 at 23:24 by Graham Kemp





























