Discrete Probability: Expected Value and Random Variable Independence [closed]
For this, I took n = 2, which makes the set {1, 2, 3, 4}; the set of bottles is {C1, C2, B1, B2}.
X = 1 if the position of the first cider bottle is 1.
P(X = 1) = 6/24 = 1/4
E(X) = 2 * 1/4 = 1/2
The general form would be n*1/2n = 1/n.
This is my attempt; I'm not sure whether it is correct.
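To check the n = 2 case, I can also enumerate all 24 orderings directly. This little Python sketch assumes X is the position of the first cider bottle among the four bottles in a random order (which is how I read the problem):

```python
from itertools import permutations

# Brute-force check for n = 2: two ciders (C1, C2) and two beers (B1, B2).
bottles = ["C1", "C2", "B1", "B2"]
orders = list(permutations(bottles))          # 4! = 24 equally likely arrangements

def first_cider_position(order):
    # Positions are counted from 1.
    return next(i for i, b in enumerate(order, start=1) if b.startswith("C"))

count_x1 = sum(1 for o in orders if first_cider_position(o) == 1)
expected_x = sum(first_cider_position(o) for o in orders) / len(orders)

print(f"P(X = 1) = {count_x1}/{len(orders)}")   # arrangements with a cider bottle first
print(f"E(X) = {expected_x}")
```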
For this question:
You roll a fair die repeatedly and independently until the result is an
even number. Define the random variables
X = the number of times you roll the die
and
Y = the result of the last roll.
For example, if the results of the rolls are 5; 1; 3; 3; 5; 2, then X = 6 and
Y = 2.
Prove that the random variables X and Y are independent.
I defined X = 1 if the number of times I roll the die is 1,
and Y = 1 if the result of the last roll is even.
So Pr(X = 1) = 3/6 = 1/2 = Pr(Y = 1),
and Pr(X = 1 and Y = 1) = 1/2.
This gives 1/2 on the left versus 1/2 * 1/2 = 1/4 on the right, which would mean they are not independent, but the question asks me to prove independence.
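As a sanity check on my reasoning, here is a small simulation sketch in Python that estimates the joint probabilities Pr(X = k, Y = y) and compares them with the product of the marginals (the variable names are mine, just for illustration):

```python
import random
from collections import Counter

def trial():
    # Roll a fair die until the result is even; return (number of rolls, last result).
    rolls = 0
    while True:
        rolls += 1
        r = random.randint(1, 6)
        if r % 2 == 0:
            return rolls, r

N = 200_000
joint = Counter(trial() for _ in range(N))
x_marginal, y_marginal = Counter(), Counter()
for (x, y), c in joint.items():
    x_marginal[x] += c
    y_marginal[y] += c

# Compare Pr(X = k, Y = y) with Pr(X = k) * Pr(Y = y) for a few small values of k.
for k in (1, 2, 3):
    for y in (2, 4, 6):
        p_joint = joint[(k, y)] / N
        p_product = (x_marginal[k] / N) * (y_marginal[y] / N)
        print(f"Pr(X={k}, Y={y}) ~ {p_joint:.4f}   Pr(X={k})Pr(Y={y}) ~ {p_product:.4f}")
```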
Tags: probability, probability-theory, discrete-mathematics, random-variables, expected-value
closed as unclear what you're asking by Did, NCh, Leucippus, Tianlalu, KReiser Dec 19 '18 at 7:10
asked Dec 17 '18 at 6:18 – Toby
Please don't ask multiple unrelated questions in one post. – Bungo Dec 17 '18 at 6:22
2 Answers
$\{X=1\}$ is the event that the first bottle is a cider bottle.
Its probability: $P(\text{first cider})=\frac{n}{n+2}$.
$\{X=2\}$ is the event that the first bottle contains beer and the second bottle contains cider.
Its probability: $P(\text{first beer})\,P(\text{second cider}\mid\text{first beer})=\frac{2}{n+2}\cdot\frac{n}{n+1}$.
$\{X=3\}$ is the event that the first bottle contains beer and the second bottle contains beer.
Its probability: $P(\text{first beer})\,P(\text{second beer}\mid\text{first beer})=\frac{2}{n+2}\cdot\frac{1}{n+1}$.
Now we are ready to find: $$\mathbb{E}X=P(X=1)+2P(X=2)+3P(X=3)=\frac{n}{n+2}+2\cdot\frac{2}{n+2}\cdot\frac{n}{n+1}+3\cdot\frac{2}{n+2}\cdot\frac{1}{n+1}=\frac{n+3}{n+1}$$
There are $2$ bottles that carry index $1$, so $P(Y=1)=\frac{2}{n+2}$.
$\{X=1,Y=1\}$ is the event that the first bottle is the cider bottle with index $1$.
Its probability: $P(X=1,Y=1)=\frac{1}{n+2}$.
So a necessary condition for independence is $$\frac{n}{n+2}\cdot\frac{2}{n+2}=P(X=1)P(Y=1)=P(X=1,Y=1)=\frac{1}{n+2},$$
which leads to $n=2$.
So we conclude that there is no independence if $n\neq 2$, and there might be independence if $n=2$. To verify, we must check for that case whether $P(X=i)P(Y=j)=P(X=i,Y=j)$ for $i,j\in\{1,2\}$.
I leave that to you.
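If you want a numerical check of these formulas, here is a short simulation sketch (it assumes, as in the reasoning above, that the $n+2$ bottles are in a uniformly random order, that $X$ is the position of the first cider bottle, and that $Y$ is the position of the first bottle carrying index $1$):

```python
import random

def sample(n):
    # n cider bottles C1..Cn and 2 beer bottles B1, B2 in a uniformly random order.
    bottles = [("C", i) for i in range(1, n + 1)] + [("B", 1), ("B", 2)]
    random.shuffle(bottles)
    x = next(pos for pos, (kind, _) in enumerate(bottles, 1) if kind == "C")   # first cider
    y = next(pos for pos, (_, idx) in enumerate(bottles, 1) if idx == 1)       # first index-1 bottle
    return x, y

def check(n, trials=200_000):
    sum_x = x1 = y1 = x1y1 = 0
    for _ in range(trials):
        x, y = sample(n)
        sum_x += x
        x1 += (x == 1)
        y1 += (y == 1)
        x1y1 += (x == 1 and y == 1)
    print(f"n = {n}:  E[X] ~ {sum_x / trials:.4f}   formula (n+3)/(n+1) = {(n + 3) / (n + 1):.4f}")
    print(f"         P(X=1)P(Y=1) ~ {(x1 / trials) * (y1 / trials):.4f}   P(X=1, Y=1) ~ {x1y1 / trials:.4f}")

for n in (2, 5):
    check(n)
```

For $n=2$ the two estimates on the last line should agree, while for $n=5$ they should visibly differ, matching the conclusion above.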
answered Dec 17 '18 at 10:32 – drhab
For the first part of the first problem, $X$ can take only the values $1, 2,$ and $3$.
For $X = 1$: a cider bottle $C_i$ is in the first place, and the rest of the places can be filled with the two $B_i$'s and the remaining $(n-1)$ $C_i$'s.
For $X = 2$: one of the $B_i$'s is in the first place, a $C_i$ is in the second place, and the rest can be filled with the remaining $B_i$ and the $(n-1)$ $C_i$'s.
For $X = 3$: both $B_i$'s occupy the first two places, and the rest can be filled with the $C_i$'s.
Number of ways $X = 1$ can happen: $\binom{n}{1}(n+1)!$
Number of ways $X = 2$ can happen: $\binom{2}{1}\binom{n}{1}\,n!$
Number of ways $X = 3$ can happen: $\binom{2}{1}\,n!$
Total number of ways: $(n+2)!$
Sanity check that $P(X=1)+P(X=2)+P(X=3) = 1$:
$$\frac{n(n+1)! + 2n\,n! + 2\,n!}{(n+2)!} = 1$$
Thus the expected value is $$E(X) = \frac{1\cdot\binom{n}{1}(n+1)!+2\cdot\binom{2}{1}\binom{n}{1}\,n!+3\cdot\binom{2}{1}\,n!}{(n+2)!} = \frac{n+3}{n+1}$$
Second part
For $Y = 1$: either $B_1$ is in the first place and the rest can be filled with $B_2$ and the $n$ $C_i$'s, for a total of $(n+1)!$ ways, or $C_1$ is in the first place and the rest can be filled with the two $B_i$'s and the remaining $(n-1)$ $C_i$'s, again $(n+1)!$ ways.
Thus $P(Y=1) = \frac{2(n+1)!}{(n+2)!}$.
For $Y = 2$: the first place can be occupied by any of $C_2,\dots,C_n$ or $B_2$, with $B_1$ in the second place and the remaining $n$ bottles filling the rest, giving $\binom{n}{1}\,n!$ ways; the same count holds with $C_1$ in the second place instead of $B_1$.
Thus $P(Y=2) = \frac{2n\,n!}{(n+2)!}$,
and so on, up to $Y = n+1$, for which $P(Y=n+1) = \frac{2\,n!}{(n+2)!}$.
Thus $$E(Y) = \frac{2(n+1)\,n!\times 1 + 2n\,n!\times 2 + 2(n-1)\,n!\times 3 +\cdots + 2(2)\,n!\times n + 2(1)\,n!\times (n+1)}{(n+2)!} = \frac{n+3}{3}$$
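To double-check the closed form, the distribution derived above can be summed exactly with rational arithmetic (a small sketch that just re-evaluates the counts given here):

```python
from fractions import Fraction
from math import factorial

def expected_Y(n):
    # P(Y = k) = 2 * (n + 2 - k) * n! / (n + 2)!  for k = 1, ..., n + 1, as counted above.
    total = factorial(n + 2)
    return sum(k * Fraction(2 * (n + 2 - k) * factorial(n), total) for k in range(1, n + 2))

for n in (2, 3, 10):
    print(n, expected_Y(n), Fraction(n + 3, 3))  # both columns should match
```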
answered Dec 17 '18 at 9:35, edited Dec 18 '18 at 6:54 – Satish Ramanathan