Finding pdf of function of independent random variables











Suppose $X$ and $Y$ are i.i.d. with a common pdf



$$f(t) = \begin{cases} \exp(-t) & \text{if } t > 0, \\ 0 & \text{otherwise.} \end{cases}$$



Show that $X + Y$ and $X/Y$ are independent.




I think the approach to this problem is: compute the joint density of $X$ and $Y$; transform via the Jacobian to get the joint density of $X + Y$ and $X/Y$; find the marginal densities of $X + Y$ and $X/Y$; and check whether their product equals the joint density.





I believe the joint density of $X$ and $Y$ is given by



$$f_{XY}(x, y) = \exp(-x) \cdot \exp(-y) = \exp(-(x + y)) \text{ if } x > 0 \text{ and } y > 0,$$



and $0$ otherwise.



Define $h(X, Y) = X + Y$ and define $g(X, Y) = X/Y$. Then, the Jacobian is given by



$$J(x, y) = \frac{\partial h}{\partial x} \frac{\partial g}{\partial y} - \frac{\partial h}{\partial y}\frac{\partial g}{\partial x} = \left(1\right)\left(-\frac{x}{y^{2}}\right) - \left(1\right)\left(\frac{1}{y}\right) = -\frac{x + y}{y^{2}}.$$
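As a sanity check on this determinant (a sketch of my own, not part of the question, assuming sympy is available):

```python
# Verify the Jacobian determinant of (h, g) = (x + y, x / y) with sympy.
import sympy as sp

x, y = sp.symbols('x y', positive=True)
h = x + y   # h(x, y) = X + Y
g = x / y   # g(x, y) = X / Y

# Jacobian matrix [[dh/dx, dh/dy], [dg/dx, dg/dy]] and its determinant
J = sp.Matrix([h, g]).jacobian([x, y]).det()
print(sp.simplify(J))
```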



The joint density of $H$ and $G$ is given by



$$f_{HG}(h, g) = f_{XY}(x, y) \cdot |J(x, y)|^{-1}$$



$$= \exp\left(-(x + y)\right) \cdot \frac{y^2}{x + y}.$$



I don't understand what I'm doing wrong, though: the result should be expressed in terms of $h$ and $g$, not $x$ and $y$.
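As an aside, the claimed independence can be sanity-checked numerically (a rough Monte Carlo sketch of my own; the sample size and the test point $(s_0, r_0)$ are arbitrary choices):

```python
# Simulate X, Y ~ Exp(1) i.i.d. and compare the joint probability
# P(X+Y <= s0, X/Y <= r0) with the product P(X+Y <= s0) * P(X/Y <= r0);
# these should agree (up to sampling error) if the two are independent.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.exponential(1.0, n)
y = rng.exponential(1.0, n)

s_sum, ratio = x + y, x / y
s0, r0 = 2.0, 1.0   # arbitrary test point

joint = np.mean((s_sum <= s0) & (ratio <= r0))
product = np.mean(s_sum <= s0) * np.mean(ratio <= r0)
print(joint, product)   # nearly equal under independence
```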




























  • Express $x=x(h,g)$ and $y=y(h,g)$ in terms of $h$ and $g$ and substitute.
    – NCh
    Nov 15 at 0:26
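Carrying out the substitution this comment suggests (the algebra below is my own fill-in):

```latex
% Invert h = x + y, g = x/y to express x and y in terms of h and g:
\[
x = \frac{hg}{1+g}, \qquad y = \frac{h}{1+g},
\]
% then substitute into f_{HG}(h,g) = e^{-(x+y)} \cdot y^2/(x+y):
\[
f_{HG}(h, g)
= e^{-h} \cdot \frac{h^{2}/(1+g)^{2}}{h}
= \underbrace{h\, e^{-h}}_{\text{function of } h}
  \cdot
  \underbrace{\frac{1}{(1+g)^{2}}}_{\text{function of } g},
\qquad h, g > 0,
\]
% so the joint density factors and X + Y, X/Y are independent.
```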










  • An alternative approach is to show that, given $X+Y=h$, you have $X$ uniformly distributed on $[0,h]$ and so $G=\frac{X}{Y}=\frac{X}{h-X}$ has a cumulative distribution function of $P\left(\frac{X}{Y} \le g\right)= \frac{g}{1+g}$ and a density of $\frac{1}{(1+g)^2}$ on the positive reals, neither of which is affected by the particular value of $h$
    – Henry
    Nov 15 at 1:07
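The CDF formula in this comment can also be checked by simulation (my own sketch; the evaluation points are arbitrary):

```python
# Empirically check that X/Y, for X, Y ~ Exp(1) i.i.d., has CDF g/(1+g).
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.exponential(1.0, n)
y = rng.exponential(1.0, n)
ratio = x / y

for g in (0.5, 1.0, 3.0):
    empirical = np.mean(ratio <= g)
    print(g, empirical, g / (1 + g))   # empirical CDF vs g/(1+g)
```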

















probability multivariable-calculus






asked Nov 15 at 0:05









joseph
