Proving That a Version of the Law of Total Probability Follows from Adam's Law
I have a homework question that asks:
Show that the following version of LOTP follows from Adam's law: for any event $A$ and continuous random variable $X$ with PDF $f_X$:
$$ P(A) = \int_{-\infty}^{\infty} P(A \mid X=x)\,f_X(x)\,dx $$
[Edited to add: Adam's Law is also known as the Law of Total Expectation and the Law of Iterated Expectation; my text gives it as $E(Y) = E(E(Y \mid X))$.]
Here is the proof I have written:
$P(A) = E(I_A)$ and $P(A|X = x) = E(I_A|X = x)$ by the fundamental bridge.
Let $E(I_A \mid X = x) = g(x)$, a function of $x$; then:
$E(g(X)) = \int_{-\infty}^{\infty} g(x)\,f_X(x)\,dx$ (this is a formula I found in my text)
$= E(E(I_A \mid X)) = E(I_A)$ (by Adam's)
$= P(A)$ (by the bridge)
Therefore,
$P(A) = \int_{-\infty}^{\infty} E(I_A \mid X)\,f_X(x)\,dx = \int_{-\infty}^{\infty} P(A \mid X)\,f_X(x)\,dx$
My main concern is that in this proof I have dropped the expected value of the indicator variable into the integral, but I believe indicator variables are always discrete. However, I'm not sure how to cope with this, because I am asked to connect the given (continuous) formula to Adam's law, which requires expectation, and connecting probability to expectation requires indicator variables.
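As a sanity check on my proof, here is a minimal numerical sketch of the identity for a made-up example (the example is illustrative, not from my text): take $X \sim N(0,1)$ and $A = \{Y < X\}$ with $Y \sim N(0,1)$ independent of $X$, so that $P(A \mid X=x) = \Phi(x)$ and, by symmetry, $P(A) = 1/2$. Both sides of the identity should agree.

```python
# Numerical sketch of P(A) = \int P(A|X=x) f_X(x) dx for a made-up example:
# X ~ N(0,1), A = {Y < X} with Y ~ N(0,1) independent of X,
# so P(A|X=x) = Phi(x) (standard normal CDF) and P(A) = 1/2 by symmetry.
import numpy as np
from scipy import integrate
from scipy.stats import norm

# Right-hand side: integrate P(A|X=x) * f_X(x) over the real line.
rhs, _ = integrate.quad(lambda x: norm.cdf(x) * norm.pdf(x), -np.inf, np.inf)

# Left-hand side: Monte Carlo estimate of P(A) = E(I_A) via the indicator.
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = rng.standard_normal(1_000_000)
lhs = (y < x).mean()  # sample mean of the indicator I_A

print(rhs, lhs)  # both should be close to 0.5
```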
Tags: probability, conditional-expectation, conditional-probability
asked Nov 20 at 21:41 by JStorm
Your argument is quite fine; there is no problem with the characteristic functions. You might want to have a look at en.wikipedia.org/wiki/Conditional_expectation for some generalizations of what you proved.
– John B, Nov 20 at 21:59
1 Answer
Therefore,
$\mathsf P(A) = \int_{-\infty}^{\infty} \mathsf E(I_A \mid X)\,f_X(x)\,dx = \int_{-\infty}^{\infty} \mathsf P(A \mid X)\,f_X(x)\,dx$
My main concern is that in this proof, I have dropped the expected value of the indicator variable into the integral, but I believe indicator variables are always discrete.
It is not a concern.
You are not using the indicator random variable, but its conditional expectation, $\mathsf E(\mathrm I_A \mid X)$, and you actually want to use the function of $x$, $\mathsf E(\mathrm I_A \mid X{=}x)$, inside the integral.
$$\begin{align}\mathsf P(A) &= \mathsf E(\mathrm I_A)\\ &= \mathsf E(\mathsf E(\mathrm I_A \mid X))\\ &= \int_{-\infty}^{\infty} \mathsf E(\mathrm I_A \mid X{=}x)\,f_X(x)\,\mathsf dx && :\quad \mathsf E(g(X)) = \int_{\Bbb R} g(x)\,f_X(x)\,\mathsf dx \\ &= \int_{-\infty}^{\infty} \mathsf P(A \mid X{=}x)\,f_X(x)\,\mathsf dx\end{align}$$
In short, the Law of Total Probability is: $\mathsf P(A) = \mathsf E(\mathsf P(A \mid X))$.
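A minimal numerical sketch of this last identity, under an assumed toy example (the same made-up setup as in the question's check: $X \sim N(0,1)$, $A = \{Y < X\}$ with $Y \sim N(0,1)$ independent, so $\mathsf P(A \mid X{=}x) = \Phi(x)$): averaging the function $g(X) = \mathsf P(A \mid X)$ over draws of $X$ should recover $\mathsf P(A) = 1/2$.

```python
# Sketch of P(A) = E(P(A|X)) for a toy example: X ~ N(0,1), A = {Y < X}
# with Y ~ N(0,1) independent, so P(A|X=x) = Phi(x) and P(A) = 1/2.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)   # draws of X
estimate = norm.cdf(x).mean()        # Monte Carlo estimate of E(P(A|X))
print(estimate)                      # should be close to 0.5
```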
answered Nov 20 at 23:24 by Graham Kemp (accepted)