Why is the maximum of i.i.d. Gaussians asymptotically $\sqrt{2 \log n}$?
Assuming that $\xi$ is bounded (as a function of $x$?), the claim is that given the equation:
$$\xi \frac{\sqrt{2\pi}}{n} = \frac{1}{x} e^{-\frac{x^2}{2}} \left( 1 + O\left(\frac{1}{x^2} \right) \right)$$
one can solve ("after some calculation") for $x$ to get:
$$x = \sqrt{2 \log n} - \frac{\log \log n + \log 4 \pi}{2 \sqrt{2 \log n}} - \frac{\log \xi}{\sqrt{2 \log n}} + O\left( \frac{1}{\log n} \right) \,.$$
Question: Would it be possible to get some hints about how to solve for $x$ in this situation?
There are several issues here that I don't understand:
- How is the assumption that $\xi$ is bounded used, or why is it otherwise relevant?
- Why is the "imprecise knowledge of $x$ transferred to $n$" when solving for $x$, and not "transferred" to some other variable? If we require an assumption on $\xi$, then why isn't the "imprecise knowledge transferred" to $\xi$? (Why don't we get an $O(f(\xi))$ term for some $f$?)
- Do we have to use or calculate some inverse of $g(x) := \frac{1}{x} e^{-\frac{x^2}{2}}$? Strictly speaking, this function isn't even defined at $x = 0$, but perhaps it has a continuous extension to the entire real line? If it does, is that extension even invertible, so that talking about $g^{-1}$ makes sense?
- If we do calculate such an inverse function, do we then basically proceed by applying it to both sides of the equation and ignoring the $O(x^{-2})$ term, with the understanding that the "uncertainty" contained in it now needs to be transferred somewhere else? If so, this leads back to the question above about where the $O((\log n)^{-1})$ term comes from.
Context: I don't think the context is actually relevant to solving this problem, but for the record this comes up on p. 374 of Cramér's 1946 Mathematical Methods of Statistics, where an asymptotic form for the maximum of i.i.d. Gaussian random variables is sought.
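For what it's worth, here is my own bootstrapping sketch (an assumption on my part, not taken from Cramér's text) of how such an expansion can be obtained without inverting $g$ explicitly. Taking logarithms of both sides of the equation gives
$$\frac{x^2}{2} + \log x + \log \xi + \tfrac{1}{2}\log(2\pi) - \log n = \log\!\left(1 + O(x^{-2})\right) = O(x^{-2}) \,,$$
so a first pass yields $x^2 \sim 2 \log n$, hence $\log x = \tfrac{1}{2}\log(2\log n) + o(1)$. Substituting this back,
$$x^2 = 2 \log n - \log \log n - \log(4\pi) - 2 \log \xi + o(1) \,,$$
and expanding $\sqrt{a - b} = \sqrt{a} - \frac{b}{2\sqrt{a}} + O\!\left(b^2 a^{-3/2}\right)$ with $a = 2\log n$ recovers the stated formula. The boundedness of $\xi$ would then guarantee $\log \xi = O(1)$, so all the discarded terms can be absorbed into the $O\!\left(\frac{1}{\log n}\right)$ remainder.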
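As a numerical sanity check (my own, not from the book), one can plug the claimed expansion, with the $O(1/\log n)$ remainder dropped and $\xi = 1$, back into $g(x) = \frac{1}{x} e^{-x^2/2}$ and compare against $\sqrt{2\pi}/n$:

```python
import math

def g(x):
    # g(x) = (1/x) * exp(-x^2 / 2)
    return math.exp(-x * x / 2) / x

def x_approx(n, xi=1.0):
    # Claimed asymptotic solution, ignoring the O(1/log n) remainder.
    L = math.log(n)
    s = math.sqrt(2 * L)
    return s - (math.log(L) + math.log(4 * math.pi)) / (2 * s) - math.log(xi) / s

for n in (10**3, 10**6, 10**9):
    lhs = math.sqrt(2 * math.pi) / n  # xi = 1
    rhs = g(x_approx(n))
    # The ratio rhs/lhs should stay near 1 if the expansion is right.
    print(f"n = {n:>12}: lhs = {lhs:.4e}, rhs = {rhs:.4e}, ratio = {rhs / lhs:.4f}")
```

The ratios stay within a few percent of $1$ over these values of $n$, which is at least consistent with the claimed expansion.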
probability statistics asymptotics problem-solving order-statistics
edited Nov 16 at 21:54
asked Nov 13 at 0:10
hasManyStupidQuestions
335