How do I determine the divergence/convergence of $\sum_n \frac{1}{\log(\log(n))}$?
I am working through some problems in Durrett's probability book, and one of them involves a variant of the law of the iterated logarithm.
I've managed to reduce the result to showing that
$$\sum_n \frac{1}{\log \log (n)}\exp(-\log \log(n)) < \infty,$$
using upper bounds for tail probabilities of the standard normal. But, as I'm really not good with this stuff, I'm unsure how I am supposed to show this (if it's indeed true).
Clearly, without the exponential the series would diverge, since $\log(n) \leq n$. But with the exponential it seems as though it could converge.
Could someone advise me how to complete this last step?
edited Dec 1 '18 at 6:15 by Chinnapparaj R
asked Dec 1 '18 at 6:12 by Xiaomi
5 Answers
We can simplify
$$\exp(-\log \log n) = \frac{1}{e^{\log \log n}} = \frac{1}{\log n}.$$
Since $(\log \log n) \log n \leq n$ for large $n$, the general term is at least $1/n$, so the series diverges.
answered Dec 1 '18 at 6:17 by platty
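[Editor's note: a quick numerical sanity check of this simplification, added for illustration; the function names are ours, not part of the original answer.]

```python
import math

def term(n: int) -> float:
    """General term of the series, written exactly as in the question."""
    return math.exp(-math.log(math.log(n))) / math.log(math.log(n))

def simplified(n: int) -> float:
    """Same term after the simplification exp(-log log n) = 1/log n."""
    return 1.0 / (math.log(math.log(n)) * math.log(n))

# The two forms agree to floating-point accuracy, and each term
# dominates 1/n, the general term of the divergent harmonic series.
for n in (10, 100, 10**6):
    assert abs(term(n) - simplified(n)) < 1e-12
    assert simplified(n) >= 1.0 / n
```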
$\sum_n \frac{1}{n}$ is divergent, and so is $\sum_n \frac{1}{\log(\log(n))}$.
But I feel $\sum_n \frac{\exp(-\log(\log(n)))}{\log(\log(n))}$ might still be convergent; you just need to find another way to prove it.
answered Dec 1 '18 at 6:19 by MoonKnight
$$
\frac{1}{(\log \log n)\exp(\log \log n)} = \frac{1}{(\log \log n) \cdot \log n} \geqslant \frac{1}{\log^2 n} \geqslant \frac{1}{n^{(1/2) \times 2}} = \frac{1}{n},
$$
so it still diverges.
answered Dec 1 '18 at 6:18 by xbh
Oh, ok. Thanks a lot. Do you have any advice on how to show $\sum_n P(Z>\sqrt{2\log \log n}\,(1+\epsilon)) < \infty$ for $Z \sim N(0,1)$?
– Xiaomi, Dec 1 '18 at 6:21
@Xiaomi Sorry, I suck at probability theory.
– xbh, Dec 1 '18 at 7:15
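[Editor's note: the chain of inequalities in the answer above can be spot-checked numerically. This sketch assumes $n \ge e^e \approx 15.2$, so that $\log \log n \ge 1$ and the first inequality holds.]

```python
import math

# Check the chain 1/((log log n) * log n) >= 1/(log n)^2 >= 1/n
# for n large enough that log log n >= 1 (i.e. n >= e^e ≈ 15.15).
for n in range(16, 100_000, 97):
    a = 1.0 / (math.log(math.log(n)) * math.log(n))
    b = 1.0 / math.log(n) ** 2
    c = 1.0 / n
    assert a >= b >= c
```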
Cauchy condensation shows
$$\sum_n \frac{1}{\log(\log(n))} \sim \sum_n \frac{2^n}{\log(\log(2^n))} = \sum_n \frac{2^n}{\log(n\log(2))} = \sum_n \frac{2^n}{\log(n) + \log(\log(2))},$$
so it is divergent.
answered Dec 1 '18 at 6:20 by trancelocation
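[Editor's note: a small numerical sketch of the condensation argument, added for illustration. The condensed terms do not tend to $0$ (in fact they blow up), so the condensed series, and hence the original, diverges by the term test.]

```python
import math

# Terms of the condensed series 2^n / log(n * log 2), for n >= 2
# (at n = 1 the denominator log(log 2) is negative).
terms = [2.0**n / math.log(n * math.log(2)) for n in range(2, 40)]

assert min(terms) > 1      # the terms stay bounded away from 0 ...
assert terms[-1] > 1e9     # ... and in fact blow up, so the series diverges
```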
Thanks a lot. Do you have any advice on how to show $\sum_n P(Z>\sqrt{2\log \log n}\,(1+\epsilon)) < \infty$ for $Z \sim N(0,1)$? The common probability inequalities do not seem tight enough to show this.
– Xiaomi, Dec 1 '18 at 6:24
You may use the integral for $N(0,1)$ for each member of the series and estimate it: $$P(Z>\sqrt{2\log \log n}\,(1+\epsilon)) \sim \int_{\sqrt{2\log \log n}(1+\epsilon)}^{\infty} e^{\frac{-x^2}{2}}\,dx$$
– trancelocation, Dec 1 '18 at 6:48
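[Editor's note: the tail estimate suggested in this comment can be checked numerically against the exact Gaussian tail, computable via the complementary error function. This is a sketch, not part of the thread; the function names are ours, and the bound used is the classical Gaussian tail bound $P(Z>t) \le e^{-t^2/2}/(t\sqrt{2\pi})$ for $t>0$.]

```python
import math

def normal_tail(t: float) -> float:
    """Exact P(Z > t) for Z ~ N(0, 1), via the complementary error function."""
    return 0.5 * math.erfc(t / math.sqrt(2))

def tail_upper_bound(t: float) -> float:
    """Classical bound P(Z > t) <= exp(-t^2/2) / (t * sqrt(2*pi)), t > 0."""
    return math.exp(-t * t / 2) / (t * math.sqrt(2 * math.pi))

# The bound is valid, and tight up to a bounded factor, for t >= 1:
for t in (1.0, 2.0, 4.0, 6.0):
    assert normal_tail(t) <= tail_upper_bound(t)
    assert tail_upper_bound(t) <= 2 * normal_tail(t)
```

Note that substituting $t = \sqrt{2\log\log n}\,(1+\epsilon)$ into $e^{-t^2/2}$ gives $\exp(-(1+\epsilon)^2 \log\log n) = (\log n)^{-(1+\epsilon)^2}$, which is the kind of decay these tail terms exhibit.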
Credit to user MoonKnight.
$\log n < n$; taking logarithms once more: $\log(\log n) < \log n < n$, hence
$$\dfrac{1}{n} < \dfrac{1}{\log n} < \dfrac{1}{\log(\log n)}.$$
Comparison test.
answered Dec 1 '18 at 7:15 by Peter Szilas