Finding the MVUE from two independent random samples


























Suppose we have a random sample $X_1, X_2, \ldots, X_n$ from an exponential$(\beta > 0)$ distribution, i.e. $f(x \mid \beta) = \frac{1}{\beta}\, e^{-x/\beta},$



and a random sample $Y_1, Y_2, \ldots, Y_n$ from an exponential$(\alpha > 0)$ distribution, and assume both samples are independent.



Let $\theta = P(X_1 < Y_1).$ Find the MVUE of $\theta$ for $n = 2.$



So, first I calculate
$$
\theta = \int_0^\infty \int_x^\infty \frac{1}{\beta}\, e^{-x/\beta}\, \frac{1}{\alpha}\, e^{-y/\alpha} \, dy \, dx = \frac{\alpha}{\alpha+\beta}.
$$
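A quick Monte Carlo sanity check of this closed form (just a sketch; the values $\alpha = 2$, $\beta = 3$ are arbitrary choices, not part of the problem):

```python
# Monte Carlo check that P(X < Y) = alpha / (alpha + beta) for
# X ~ Exp(mean beta), Y ~ Exp(mean alpha); parameter values arbitrary.
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 2.0, 3.0
n_sim = 1_000_000

x = rng.exponential(scale=beta, size=n_sim)
y = rng.exponential(scale=alpha, size=n_sim)

print((x < y).mean())          # simulated P(X < Y), about 0.400
print(alpha / (alpha + beta))  # closed form: 0.4
```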



Then, since $f(X,Y) = f(X)\cdot f(Y) = f(X_1)\cdot f(X_2)\cdot f(Y_1)\cdot f(Y_2)$ belongs to the exponential family, $(x_1+x_2,\ y_1+y_2)$ is a complete sufficient statistic.



$x_1+x_2 \sim \operatorname{Gamma}(2,\beta)$ and $y_1+y_2 \sim \operatorname{Gamma}(2,\alpha).$
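As a numerical check of this distributional claim (a sketch, with an arbitrary $\beta$):

```python
# Kolmogorov-Smirnov check that the sum of two independent Exp(mean beta)
# draws follows Gamma(shape=2, scale=beta); beta = 3 is arbitrary.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
beta = 3.0
sums = rng.exponential(scale=beta, size=(100_000, 2)).sum(axis=1)

# args = (shape, loc, scale); a large p-value is consistent with the claim.
print(stats.kstest(sums, "gamma", args=(2, 0, beta)))
```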



Now I am stuck. Any help, please?










  • I'm sorry, I don't really understand what you are trying to calculate. What is MVUE, and is this what you are trying to find?
    – Jesper Hybel, Jan 1 at 4:10












  • My bad: minimum variance unbiased estimator.
    – user0533535412, Jan 1 at 7:19
















mathematical-statistics inference information-theory






asked Jan 1 at 2:50 by user0533535412
edited Jan 1 at 7:07 by Michael Hardy












1 Answer


















Some of your notation is abominable and, most unfortunately, you are in good company. You're using the same letter $f$ to refer to several different functions. If instead one writes $f_X$ and $f_Y$ then one can understand the difference between $f_X(3)$ and $f_Y(3),$ and one can understand things like $\Pr(X\le x)$ (where $X$ and $x$ are two different things).



And you should say $X_1+X_2,$ rather than $x_1+x_2,$ has a gamma distribution, and similarly for the other one.



You have
$$
f_{X_1,X_2}(x_1,x_2) = \frac 1 {\beta^2} e^{-(x_1+x_2)/\beta} \quad\text{for } x_1,x_2 \ge 0,
$$

and the fact that this depends on $(x_1,x_2)$ only through $x_1+x_2$ is sufficient (but not necessary) to establish that $X_1+X_2$ (not $x_1+x_2$) is a sufficient statistic for $\beta.$



Showing completeness is another matter, but before that let's Rao–Blackwellize.



Let $W = \begin{cases} 1 & \text{if } X_1 < Y_1, \\ 0 & \text{otherwise.} \end{cases}$



Then $W$ is an unbiased estimator of $\theta.$ So the Rao–Blackwell estimator is
\begin{align}
& \operatorname E(W \mid X_1+X_2,\, Y_1+Y_2) \\[10pt]
= {} & \Pr(W=1 \mid X_1+X_2,\, Y_1+Y_2) \\[10pt]
= {} & \Pr(X_1<Y_1 \mid X_1+X_2,\, Y_1+Y_2).
\end{align}

The conditional distribution of $X_1$ given that $X_1+X_2=x$ is uniform on the interval $[0,x]$ because the joint density of $(X_1,X_2)$ is constant on that set. Similarly the conditional distribution of $Y_1$ given $Y_1+Y_2=y$ is uniform on $[0,y].$ Hence the conditional distribution of $(U_1,U_2)=(X_1/x,\,Y_1/y)$ given $X_1+X_2=x,\ Y_1+Y_2=y$ is uniform on the square $[0,1]\times[0,1].$ We seek $\Pr\left( U_1 < \dfrac y x U_2 \right).$
$$
\Pr\left(U_1 < \frac y x U_2 \right) = \begin{cases} y/(2x) & \text{if } x \ge y, \\[8pt] 1 - x/(2y) & \text{if } x \le y. \end{cases}
$$
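A numerical sketch of this case formula, using arbitrary values $x = 3$, $y = 1$ (so $x \ge y$):

```python
# Check Pr(U1 < (y/x) * U2) for independent U1, U2 ~ Uniform(0, 1);
# x = 3, y = 1 are arbitrary values with x >= y, so the answer is y/(2x).
import numpy as np

rng = np.random.default_rng(2)
x, y = 3.0, 1.0
u1, u2 = rng.uniform(size=(2, 1_000_000))

print((u1 < (y / x) * u2).mean())  # simulated, about 0.1667
print(y / (2 * x))                 # closed form: 1/6
```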

So the Rao–Blackwell estimator is
$$
\hat\theta = \begin{cases} \dfrac{Y_1+Y_2}{2(X_1+X_2)} & \text{if } X_1+X_2 \ge Y_1+Y_2, \\[8pt] 1 - \dfrac{X_1+X_2}{2(Y_1+Y_2)} & \text{if } X_1+X_2 \le Y_1+Y_2. \end{cases}
$$
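A short simulation sketch of this estimator (the values $\alpha = 2$, $\beta = 3$ are arbitrary): its average should match $\theta = \alpha/(\alpha+\beta),$ and by the Rao–Blackwell theorem its variance should be below that of the crude indicator $W$.

```python
# Monte Carlo check that the Rao-Blackwell estimator is unbiased for
# theta = alpha / (alpha + beta) and beats W = 1{X1 < Y1} in variance.
# alpha = 2, beta = 3 are arbitrary illustrative values.
import numpy as np

rng = np.random.default_rng(3)
alpha, beta, n_sim = 2.0, 3.0, 1_000_000

sx = rng.exponential(scale=beta, size=(n_sim, 2)).sum(axis=1)   # X1 + X2
sy = rng.exponential(scale=alpha, size=(n_sim, 2)).sum(axis=1)  # Y1 + Y2

theta_hat = np.where(sx >= sy, sy / (2 * sx), 1 - sx / (2 * sy))

theta = alpha / (alpha + beta)
print(theta_hat.mean(), theta)               # both about 0.4
print(theta_hat.var(), theta * (1 - theta))  # estimator variance < 0.24
```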



That's the UMVUE if we have completeness.



Here, let $g(X_1+X_2)$ be any unbiased estimator of zero; then
\begin{align}
& \operatorname E(g(X_1+X_2)) \\[8pt]
= {} & \frac 1 {\Gamma(2)\,\beta^2} \int_0^\infty g(x)\, x^{2-1}\, e^{-x/\beta} \, dx.
\end{align}

Up to the positive factor $1/\beta^2,$ this is the Laplace transform, evaluated at $1/\beta,$ of $x \mapsto x g(x).$ We want it to be $0$ regardless of the value of $\beta.$ That can happen only if $x g(x)$ is $0$ for all values of $x \ge 0.$ Thus we have no nontrivial unbiased estimators of zero.






answered Jan 1 at 7:06 by Michael Hardy (edited Jan 1 at 20:43)












