Differentiating an integral that grows like log asymptotically
Suppose I have a continuous function $f(x)$ that is non-increasing and always stays between $0$ and $1$, and it is known that



$$ \int_0^t f(x)\, dx = \log t + o(\log t), \qquad t \to \infty.$$



Unfortunately one has no control over the error term $o(log t)$ (other than what is implied by the above asymptotic behaviour and the properties of $f$). My question is whether it is possible to conclude that



$$ f(t) = \frac{1}{t} + o(t^{-1}), \qquad t \to \infty.$$


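(For orientation, here is a quick numerical illustration of my own, not part of the question: the function $f(x) = 1/(1+x)$ satisfies all the hypotheses, and for it the desired conclusion does hold. The names `f` and `F` are just labels for this sketch.)

```python
import math

# Illustration (mine): f(x) = 1/(1+x) is continuous, non-increasing, takes
# values in (0, 1], and int_0^t f = log(1+t) = log t + o(log t).
# For this f, the hoped-for conclusion t*f(t) -> 1 indeed holds.
def f(x):
    return 1.0 / (1.0 + x)

def F(t):
    # Closed form of the integral of f over [0, t].
    return math.log(1.0 + t)

for t in (1e3, 1e6, 1e9):
    print(t, F(t) / math.log(t), t * f(t))  # both ratios approach 1
```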

Update: as Raziel's response shows, the claim above does not hold in general, and the problem is related to de Haan theory (with which I am not very familiar). I would therefore like to ask whether it is still possible to find some $C > 0$ such that, for $t$ sufficiently large,



$$ \frac{1}{Ct} \le f(t) \le \frac{C}{t}.$$



---------Old follow-up question below; please ignore---------



If this is possible, a follow-up question is whether the claim can be extended to an $f$ with countably many jumps (and continuous elsewhere). Of course I will still assume that $f(x) \in [0,1]$ and that the function is non-increasing, which in particular means that the jumps are negative and the size of the jump at $x_i$ (if any) is bounded by $f(x_i)$.



(One may start by solving the toy problem in which $f$ is differentiable, if that simplifies the problem and offers any useful insight.)
  • Am I missing something? Is this not l'Hospital's rule? – Venkataramana, Jan 14 at 8:35










  • @Venkataramana I could be wrong, but aren't you suggesting the converse of L'Hospital's rule (if the ratio of two functions has a limit, then the ratio of the derivatives has the same limit), which does not seem to be true in general? I have no control over $o(\log t)$ and cannot say much about its derivative, so I am not sure how L'Hospital's rule would be applied. – random_person, Jan 14 at 8:46






  • This belongs to Karamata's Tauberian theory; it is a kind of monotone density theorem for de Haan classes, I believe. I'll look into the Bingham–Goldie–Teugels book later today and write more. – Mateusz Kwaśnicki, Jan 14 at 8:54
pr.probability real-analysis ca.classical-analysis-and-odes probability-distributions asymptotics
asked Jan 14 at 8:09, edited Jan 14 at 12:40 by random_person (955)
2 Answers
The answer is no, even in the smooth case. Take for example:



$$
f(x) = \frac{2}{x} + \frac{\cos(\log x)}{x}
$$



Alter it on a small neighborhood of $0$ in such a way that there is no singularity there, preserving smoothness (this will be irrelevant for the asymptotics). This function is decreasing and, for $t$ sufficiently large, we have



$$
\int_0^t f(x)\, dx = C + 2\log t + \sin(\log t) = 2\log t + o(\log t).
$$
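(A numerical sanity check of this counterexample, my own sketch; the names `f` and `F` are just labels here. On $[1,\infty)$ the antiderivative is $2\log t + \sin(\log t)$, so $F(t)/\log t \to 2$, while $t\,f(t) = 2 + \cos(\log t)$ keeps oscillating between $1$ and $3$ and never settles.)

```python
import math

# Raziel's counterexample on [1, oo): f(x) = (2 + cos(log x)) / x.
# (The modification near 0 is irrelevant for the asymptotics.)
def f(x):
    return (2.0 + math.cos(math.log(x))) / x

def F(t):
    # Antiderivative of f on [1, t]: 2*log(t) + sin(log(t)).
    return 2.0 * math.log(t) + math.sin(math.log(t))

# F(t)/log(t) tends to 2, yet t*f(t) = 2 + cos(log t) oscillates forever:
for t in (1e4, 1e8, 1e12):
    print(t, F(t) / math.log(t), t * f(t))
```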



The monotone density theorem mentioned in the comments does not work in general if your r.h.s. is merely a slowly varying function (as is any function asymptotic to $\log t$). You want the r.h.s. to be a de Haan function. The specific result you may want to use is Theorem 3.6.8 here:



Bingham, N. H.; Goldie, C. M.; Teugels, J. L., Regular Variation, Encyclopedia of Mathematics and its Applications 27, Cambridge University Press, 1989. ZBL0667.26003.






answered Jan 14 at 9:00, edited Jan 14 at 9:33 by Raziel (2,029)
  • What about bounds? Is it possible to show that there exists some $C > 0$ such that $\frac{1}{Ct} \le f(t) \le \frac{C}{t}$ for $t$ sufficiently large? – random_person, Jan 14 at 9:40










  • The original asymptotic equation is equivalent to $f$ being in the right de Haan class. A two-sided estimate would correspond to de Haan's analogue of $O$-regular variation, I think, and it does not follow automatically. The counterexample is less explicit, though; I will type it later today on a computer, if you like. – Mateusz Kwaśnicki, Jan 14 at 10:48










  • @MateuszKwaśnicki Thanks, I am looking forward to your answer. – random_person, Jan 14 at 12:19






  • @random_person: I had just started to type when Iosif Pinelis gave essentially the same construction. I can only add that what I meant in my first comment was Theorem 3.6.8 in the BGT book, which gives a necessary and sufficient condition for the desired asymptotics of $f$ in terms of its primitive function. See also Sections 3.7.1–3.7.2 therein. – Mateusz Kwaśnicki, Jan 14 at 12:45
The post by Raziel shows that the answer to the original question is no. The OP then asked, in a comment to that post, if one can still conclude that $f(t)\asymp\frac1t$ (as $t\to\infty$); as usual, $a\asymp b$ means here that $\limsup|\frac ab+\frac ba|<\infty$.



Let us show that the answer is still no. E.g., for $j=0,1,\dots$ let $t_j:=e^{j^2}$,
\begin{equation}
c_j:=\frac{\ln t_{j+1}-\ln t_j}{t_{j+1}-t_j}\sim\frac{2j}{t_{j+1}} \tag{1}
\end{equation}
(as $j\to\infty$), and
\begin{equation}
f(x):=c_j\quad\text{for}\quad x\in[t_j,t_{j+1}),
\end{equation}
with $f:=c_0=\frac1{e-1}$ on $[0,t_0)$.
Let also $F(t):=\int_0^t f(x)\,dx$.



Then $f$ is nonincreasing, $0<f\le1$, $F(t_j)=c_0+\ln t_j\sim c_0+\ln t_{j+1}=F(t_{j+1})$, whence $F(t)\sim\ln t$ (as $t\to\infty$), whereas $f(t_{j+1}-)=c_j$ is much greater than $\frac1{t_{j+1}}$, by (1).
We also see that $f(t_j)=c_j$ is much less than $\frac1{t_j}$, again by (1).



The only condition missed here is the continuity of $f$, as $f$ is not left-continuous at $t_{j+1}$ for $j=0,1,\dots$. This omission is quite easy, but tedious, to fix by approximation. For instance, one can replace the above $f$ on every interval $[t_{j+1}-c_0 2^{-j},t_{j+1}]$ by the linear interpolation of $f$ on the same interval. Then instead of the value $c_0+\ln t_{j+1}$ of $F(t_{j+1})$ we will have $b_j+\ln t_{j+1}\sim c_0+\ln t_j=F(t_j)$ for some $b_j\in[0,c_0]$, and instead of $f(t_{j+1}-)=c_j$ being much greater than $\frac1{t_{j+1}}$, we will have that $f(t_{j+1}-c_0)=c_j$ is much greater than $\frac1{t_{j+1}-c_0}$.
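(A quick numerical check of this construction, my own illustration; floating point restricts $j$ to small values, since $t_j = e^{j^2}$ overflows double precision around $j = 26$. It shows $t_j f(t_j) \to 0$ while $t_{j+1} f(t_{j+1}-) \to \infty$, so neither the lower bound $\frac{1}{Ct} \le f(t)$ nor the upper bound $f(t) \le \frac{C}{t}$ can hold.)

```python
import math

# Iosif Pinelis's step-function counterexample: t_j = e^{j^2} and
# f = c_j on [t_j, t_{j+1}), with c_j = (ln t_{j+1} - ln t_j) / (t_{j+1} - t_j).
def t(j):
    return math.exp(j * j)

def c(j):
    return (math.log(t(j + 1)) - math.log(t(j))) / (t(j + 1) - t(j))

# c_j * t_j ~ (2j+1) e^{-(2j+1)} -> 0      (f much smaller than 1/t at t_j),
# c_j * t_{j+1} ~ 2j+1 -> infinity         (f much larger than 1/t near t_{j+1}).
for j in range(2, 7):
    print(j, c(j) * t(j), c(j) * t(j + 1))
```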
  • Thanks for the nice counter-example. Would it still be possible to establish the upper bound $f(t) \le \frac{C}{t}$, though? – random_person, Jan 14 at 12:38










  • @random_person: This very example shows that the upper bound $\frac{C}{t}$ on $f(t)$ is impossible in general, as we have $f(t_{j+1}-)/\frac1{t_{j+1}}\to\infty$. – Iosif Pinelis, Jan 14 at 12:41












  • Oh, I have asked a dumb question. I actually wanted to ask whether a lower bound $f(t) \ge \frac{1}{Ct}$ is possible. – random_person, Jan 14 at 12:42








  • @random_person: I have now added a sentence showing that, in the same example, the lower bound $\frac1{Ct}$ on $f(t)$ is impossible as well. – Iosif Pinelis, Jan 14 at 12:48












  • I am feeling so embarrassed that I missed this observation... thank you so much for your patience, and again for your counter-example. – random_person, Jan 14 at 12:52











Your Answer





StackExchange.ifUsing("editor", function () {
return StackExchange.using("mathjaxEditing", function () {
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
});
});
}, "mathjax-editing");

StackExchange.ready(function() {
var channelOptions = {
tags: "".split(" "),
id: "504"
};
initTagRenderer("".split(" "), "".split(" "), channelOptions);

StackExchange.using("externalEditor", function() {
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled) {
StackExchange.using("snippets", function() {
createEditor();
});
}
else {
createEditor();
}
});

function createEditor() {
StackExchange.prepareEditor({
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader: {
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
},
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
});


}
});














draft saved

draft discarded


















StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmathoverflow.net%2fquestions%2f320841%2fdifferentiating-an-integral-that-grows-like-log-asymptotically%23new-answer', 'question_page');
}
);

Post as a guest















Required, but never shown

























2 Answers
2






active

oldest

votes








2 Answers
2






active

oldest

votes









active

oldest

votes






active

oldest

votes









11












$begingroup$

The answer is no, even in the smooth case. Take for example:



$$
f(x) = frac{2}{x} + frac{cos(log(x))}{x}
$$



Alter it on a small neighborhood of $0$ in such a way that there is no singularity there, preserving smoothness (this will be irrelevant for the asymptotics). This function is decreasing and, for $t$ sufficiently large, we have



$$
int_0^t f(x) dx = C + 2log(t) + sin(log(t)) = 2log(t) + o(log(t))
$$



The monotone density theorem mentioned in the comments does not work in general if your r.h.s. is simply a slowly varying function (as any function asymptotic to $log(t)$). You want your r.h.s. to be a de Haan function. The specific result you may want to use is Theorem 3.6.8 here:



Bingham, N. H.; Goldie, C. M.; Teugels, J. L., Regular variation., Encyclopedia of Mathematics and its Applications, 27. Cambridge etc.: Cambridge University Press. 512 p. £ 20.00/pbk (1989). ZBL0667.26003.






share|cite|improve this answer











$endgroup$













  • $begingroup$
    What about bounds? Is it possible to show that there exists some $C > 0$ such that $frac{1}{Ct} le f(t) le frac{C}{t}$ for $t$ sufficiently large?
    $endgroup$
    – random_person
    Jan 14 at 9:40










  • $begingroup$
    Original asymptotic equation is equivalent to $f$ being in the right de Haan class. Two-sided estimate would correspond to de Haan's analogue of $O$-regular variation, I think, and it does not follow automatically. The counter-example is less explicit, though, I will type it later today on a computer, if you like.
    $endgroup$
    – Mateusz Kwaśnicki
    Jan 14 at 10:48










  • $begingroup$
    @MateuszKwaśnicki Thanks, I am looking forward to your answer.
    $endgroup$
    – random_person
    Jan 14 at 12:19






  • 1




    $begingroup$
    @random_person: I just started to type when Iosif Pinelis gave esentially the same construction. I can oly add that what I meant in my first comment was Theorem 3.6.8 in the BGT book, which gives a necessary and sufficient condition for the desired asymptotics of $f$ in terms of its primitive function. See also Sections 3.7.1–3.7.2 therein.
    $endgroup$
    – Mateusz Kwaśnicki
    Jan 14 at 12:45
















11












$begingroup$

The answer is no, even in the smooth case. Take for example:



$$
f(x) = frac{2}{x} + frac{cos(log(x))}{x}
$$



Alter it on a small neighborhood of $0$ in such a way that there is no singularity there, preserving smoothness (this will be irrelevant for the asymptotics). This function is decreasing and, for $t$ sufficiently large, we have



$$
int_0^t f(x) dx = C + 2log(t) + sin(log(t)) = 2log(t) + o(log(t))
$$



The monotone density theorem mentioned in the comments does not work in general if your r.h.s. is simply a slowly varying function (as any function asymptotic to $log(t)$). You want your r.h.s. to be a de Haan function. The specific result you may want to use is Theorem 3.6.8 here:



Bingham, N. H.; Goldie, C. M.; Teugels, J. L., Regular variation., Encyclopedia of Mathematics and its Applications, 27. Cambridge etc.: Cambridge University Press. 512 p. £ 20.00/pbk (1989). ZBL0667.26003.






share|cite|improve this answer











$endgroup$













  • $begingroup$
    What about bounds? Is it possible to show that there exists some $C > 0$ such that $frac{1}{Ct} le f(t) le frac{C}{t}$ for $t$ sufficiently large?
    $endgroup$
    – random_person
    Jan 14 at 9:40










  • $begingroup$
    Original asymptotic equation is equivalent to $f$ being in the right de Haan class. Two-sided estimate would correspond to de Haan's analogue of $O$-regular variation, I think, and it does not follow automatically. The counter-example is less explicit, though, I will type it later today on a computer, if you like.
    $endgroup$
    – Mateusz Kwaśnicki
    Jan 14 at 10:48










  • $begingroup$
    @MateuszKwaśnicki Thanks, I am looking forward to your answer.
    $endgroup$
    – random_person
    Jan 14 at 12:19






  • 1




    $begingroup$
    @random_person: I just started to type when Iosif Pinelis gave esentially the same construction. I can oly add that what I meant in my first comment was Theorem 3.6.8 in the BGT book, which gives a necessary and sufficient condition for the desired asymptotics of $f$ in terms of its primitive function. See also Sections 3.7.1–3.7.2 therein.
    $endgroup$
    – Mateusz Kwaśnicki
    Jan 14 at 12:45














11












11








11





$begingroup$

The answer is no, even in the smooth case. Take for example:



$$
f(x) = frac{2}{x} + frac{cos(log(x))}{x}
$$



Alter it on a small neighborhood of $0$ in such a way that there is no singularity there, preserving smoothness (this will be irrelevant for the asymptotics). This function is decreasing and, for $t$ sufficiently large, we have



$$
int_0^t f(x) dx = C + 2log(t) + sin(log(t)) = 2log(t) + o(log(t))
$$



The monotone density theorem mentioned in the comments does not work in general if your r.h.s. is simply a slowly varying function (as any function asymptotic to $log(t)$). You want your r.h.s. to be a de Haan function. The specific result you may want to use is Theorem 3.6.8 here:



Bingham, N. H.; Goldie, C. M.; Teugels, J. L., Regular variation., Encyclopedia of Mathematics and its Applications, 27. Cambridge etc.: Cambridge University Press. 512 p. £ 20.00/pbk (1989). ZBL0667.26003.






share|cite|improve this answer











$endgroup$



The answer is no, even in the smooth case. Take for example:



$$
f(x) = frac{2}{x} + frac{cos(log(x))}{x}
$$



Alter it on a small neighborhood of $0$ in such a way that there is no singularity there, preserving smoothness (this will be irrelevant for the asymptotics). This function is decreasing and, for $t$ sufficiently large, we have



$$
int_0^t f(x) dx = C + 2log(t) + sin(log(t)) = 2log(t) + o(log(t))
$$



The monotone density theorem mentioned in the comments does not work in general if your r.h.s. is simply a slowly varying function (as any function asymptotic to $log(t)$). You want your r.h.s. to be a de Haan function. The specific result you may want to use is Theorem 3.6.8 here:



Bingham, N. H.; Goldie, C. M.; Teugels, J. L., Regular variation., Encyclopedia of Mathematics and its Applications, 27. Cambridge etc.: Cambridge University Press. 512 p. £ 20.00/pbk (1989). ZBL0667.26003.







share|cite|improve this answer














share|cite|improve this answer



share|cite|improve this answer








edited Jan 14 at 9:33

























answered Jan 14 at 9:00









RazielRaziel

2,02911425




2,02911425












  • $begingroup$
    What about bounds? Is it possible to show that there exists some $C > 0$ such that $frac{1}{Ct} le f(t) le frac{C}{t}$ for $t$ sufficiently large?
    $endgroup$
    – random_person
    Jan 14 at 9:40










  • $begingroup$
    Original asymptotic equation is equivalent to $f$ being in the right de Haan class. Two-sided estimate would correspond to de Haan's analogue of $O$-regular variation, I think, and it does not follow automatically. The counter-example is less explicit, though, I will type it later today on a computer, if you like.
    $endgroup$
    – Mateusz Kwaśnicki
    Jan 14 at 10:48










  • $begingroup$
    @MateuszKwaśnicki Thanks, I am looking forward to your answer.
    $endgroup$
    – random_person
    Jan 14 at 12:19






  • 1




    $begingroup$
    @random_person: I just started to type when Iosif Pinelis gave esentially the same construction. I can oly add that what I meant in my first comment was Theorem 3.6.8 in the BGT book, which gives a necessary and sufficient condition for the desired asymptotics of $f$ in terms of its primitive function. See also Sections 3.7.1–3.7.2 therein.
    $endgroup$
    – Mateusz Kwaśnicki
    Jan 14 at 12:45


















  • $begingroup$
    What about bounds? Is it possible to show that there exists some $C > 0$ such that $frac{1}{Ct} le f(t) le frac{C}{t}$ for $t$ sufficiently large?
    $endgroup$
    – random_person
    Jan 14 at 9:40










  • $begingroup$
    Original asymptotic equation is equivalent to $f$ being in the right de Haan class. Two-sided estimate would correspond to de Haan's analogue of $O$-regular variation, I think, and it does not follow automatically. The counter-example is less explicit, though, I will type it later today on a computer, if you like.
    $endgroup$
    – Mateusz Kwaśnicki
    Jan 14 at 10:48










  • $begingroup$
    @MateuszKwaśnicki Thanks, I am looking forward to your answer.
    $endgroup$
    – random_person
    Jan 14 at 12:19






  • 1




    $begingroup$
    @random_person: I just started to type when Iosif Pinelis gave esentially the same construction. I can oly add that what I meant in my first comment was Theorem 3.6.8 in the BGT book, which gives a necessary and sufficient condition for the desired asymptotics of $f$ in terms of its primitive function. See also Sections 3.7.1–3.7.2 therein.
    $endgroup$
    – Mateusz Kwaśnicki
    Jan 14 at 12:45
















$begingroup$
What about bounds? Is it possible to show that there exists some $C > 0$ such that $frac{1}{Ct} le f(t) le frac{C}{t}$ for $t$ sufficiently large?
$endgroup$
– random_person
Jan 14 at 9:40




$begingroup$
What about bounds? Is it possible to show that there exists some $C > 0$ such that $frac{1}{Ct} le f(t) le frac{C}{t}$ for $t$ sufficiently large?
$endgroup$
– random_person
Jan 14 at 9:40












$begingroup$
Original asymptotic equation is equivalent to $f$ being in the right de Haan class. Two-sided estimate would correspond to de Haan's analogue of $O$-regular variation, I think, and it does not follow automatically. The counter-example is less explicit, though, I will type it later today on a computer, if you like.
$endgroup$
– Mateusz Kwaśnicki
Jan 14 at 10:48




$begingroup$
Original asymptotic equation is equivalent to $f$ being in the right de Haan class. Two-sided estimate would correspond to de Haan's analogue of $O$-regular variation, I think, and it does not follow automatically. The counter-example is less explicit, though, I will type it later today on a computer, if you like.
$endgroup$
– Mateusz Kwaśnicki
Jan 14 at 10:48












$begingroup$
@MateuszKwaśnicki Thanks, I am looking forward to your answer.
$endgroup$
– random_person
Jan 14 at 12:19




$begingroup$
@MateuszKwaśnicki Thanks, I am looking forward to your answer.
$endgroup$
– random_person
Jan 14 at 12:19




1




1




$begingroup$
@random_person: I just started to type when Iosif Pinelis gave esentially the same construction. I can oly add that what I meant in my first comment was Theorem 3.6.8 in the BGT book, which gives a necessary and sufficient condition for the desired asymptotics of $f$ in terms of its primitive function. See also Sections 3.7.1–3.7.2 therein.
$endgroup$
– Mateusz Kwaśnicki
Jan 14 at 12:45




$begingroup$
@random_person: I just started to type when Iosif Pinelis gave esentially the same construction. I can oly add that what I meant in my first comment was Theorem 3.6.8 in the BGT book, which gives a necessary and sufficient condition for the desired asymptotics of $f$ in terms of its primitive function. See also Sections 3.7.1–3.7.2 therein.
$endgroup$
– Mateusz Kwaśnicki
Jan 14 at 12:45











6












$begingroup$

The post by Raziel shows that the answer to the original question is no. The OP then asked, in a comment to that post, if one one still conclude that $f(t)asympfrac1t$ (as $toinfty$); as usual, $aasymp b$ means here that $limsup|frac ab+frac ba|<infty$.



Let us show that the answer is still no. E.g., for $j=0,1,dots$ let $t_j:=e^{j^2}$,
begin{equation}
c_j:=frac{ln t_{j+1}-ln t_j}{t_{j+1}-t_j}simfrac{2j}{t_{j+1}} tag{1}
end{equation}

(as $jtoinfty$), and
begin{equation}
f(x):=c_jquadtext{for}quad xin[t_j,t_{j+1}),
end{equation}

with $f:=c_0=frac1{e-1}$ on $[0,t_0)$.
Let also $F(t):=int_0^t f(x),dx$.



Then $f$ is nonincreasing, $0<fle1$, $F(t_j)=c_0+ln t_jsim c_0+ln t_{j+1}=F(t_{j+1})$, whence $F(t)simln t$ (as $ttoinfty$), whereas $f(t_{j+1}-)=c_j$ is much greater than $frac1{t_{j+1}}$, by (1).
We also see that $f(t_j)=c_j$ is much less than $frac1{t_j}$, again by (1).



The only condition missed here is the continuity of $f$, as $f$ is not left-continuous at $t_{j+1}$ for $j=0,1,dots$. This omission is quite easy, but tedious, to fix by approximation. For instance, one can replace the above $f$ on every interval $[t_{j+1}-c_02^{-j},t_{j+1}]$ by the linear interpolation of $f$ on the same interval. Then instead of the value $c_0+ln t_{j+1}$ of $F(t_{j+1})$ we will have $b_j+ln t_{j+1}sim c_0+ln t_j=F(t_j)$ for some $b_jin[0,c_0]$, and instead of $f(t_{j+1}-)=c_j$ being much greater than $frac1{t_{j+1}}$, we will have that $f(t_{j+1}-c_0)=c_j$ is much greater than $frac1{t_{j+1}-c_0}$.






share|cite|improve this answer











$endgroup$













  • Thanks for the nice counter-example. Would it still be possible to establish the upper bound $f(t) \le \frac{C}{t}$ though?
    – random_person, Jan 14 at 12:38

  • @random_person : This very example shows that the upper bound $\frac Ct$ on $f(t)$ is impossible in general, as we have $f(t_{j+1}-)/\frac1{t_{j+1}}\to\infty$.
    – Iosif Pinelis, Jan 14 at 12:41

  • Oh, I have asked a dumb question. I actually want to ask if a lower bound $f(t) \ge \frac{1}{Ct}$ is possible.
    – random_person, Jan 14 at 12:42

  • @random_person : I have now added a sentence showing that, in the same example, the lower bound $\frac1{Ct}$ on $f(t)$ is impossible as well.
    – Iosif Pinelis, Jan 14 at 12:48

  • I am feeling so embarrassed that I have missed this observation... thank you so much for your patience and again your counter-example.
    – random_person, Jan 14 at 12:52
















answered Jan 14 at 12:26, edited Jan 14 at 13:58 – Iosif Pinelis











