Construction of a function going to 0 whose derivative does not go to 0 as x tends to infinity

I am interested in finding an example of a twice differentiable function $f$ on $(0,\infty)$ with $f(x)\to 0$ as $x\to\infty$, but such that $f'(x)$ does not tend to $0$ as $x\to\infty$.

I have already proved that if $f''$ is bounded, then no such example exists. I tried to find examples but did not succeed. Any help will be appreciated.







Tags: real-analysis, derivatives, examples-counterexamples






asked Nov 13 at 9:41 by Shubham








  • Oscillatory functions in general. $f(x)=\sin(x^2)/x$ is a nice example.
    – user254433, Nov 13 at 9:44










  • How about $f(x) = \frac{\sin(x^2)}{x}$? We have $f'(x)=2\cos(x^2) -\frac{\sin(x^2)}{x^2}$, so $\lim_{x\to\infty} f(x)=0$, but $\lim_{x\to\infty}f'(x)$ does not exist.
    – molarmass, Nov 13 at 9:57














1 Answer (accepted)










The idea for a question like yours is that $f$ must oscillate heavily as we approach the tail, while its amplitude steadily decreases to zero. Essentially, try to make $f$ a product of two terms: one curbing the oscillation, ensuring that the function goes to zero, and the other increasing the frequency of oscillation (keeping the amplitude constant), ensuring that the derivative does not go to zero.



For example, $\frac 1x$, which goes to zero, and $\sin(x^2)$, which oscillates with increasing frequency, satisfy these conditions. (Note that $\sin x$ also oscillates, but $\frac 1x$ does not merely go to zero: it also damps the oscillations, since its derivative $-\frac 1{x^2}$ decays even faster. To overcome this damping, we need increasingly rapid oscillations.) You can check that $\frac 1x \sin(x^2)$ is a counterexample. So is $\frac 1x \sin(x^3)$.
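
To spell this out for the first example, differentiating directly gives
$$f(x)=\frac{\sin(x^2)}{x}\quad\Longrightarrow\quad f'(x)=2\cos(x^2)-\frac{\sin(x^2)}{x^2},$$
so $|f(x)|\le \frac 1x\to 0$, while $f'\big(\sqrt{2\pi n}\big)=2$ and $f'\big(\sqrt{(2n+1)\pi}\big)=-2$ for every $n$; hence $f'$ has no limit at infinity. (Here $f'$ stays bounded; with $\sin(x^3)$ in place of $\sin(x^2)$ the derivative is even unbounded.)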



For fun, try to come up with conditions on $f$ and $g$ so that $h(x) = f(x)\sin g(x)$ is a counterexample to your assertion.





As you have noted, $f''$ being bounded implies that if $f(x) \to 0$ then $f'(x) \to 0$ as $x \to \infty$. However, there is a more general condition, weaker than boundedness of $f''$: uniform continuity of $f'$. If $f'$ is merely assumed to be uniformly continuous, rather than differentiable with bounded derivative, then it still follows that $f'(x) \to 0$. You can try this as an exercise.
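
(Why is bounded $f''$ the stronger hypothesis? If $|f''|\le M$ on the half-line, then by the mean value theorem
$$|f'(x)-f'(y)| = |f''(\xi)|\,|x-y| \le M\,|x-y|$$
for some $\xi$ between $x$ and $y$, so $f'$ is Lipschitz and in particular uniformly continuous.)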





Answer to the exercise



Known as Barbalat's lemma is the statement that if $f$ is continuous on $[a,\infty)$, differentiable on $(a,\infty)$, and $f'$ is uniformly continuous on $(a,\infty)$, then $\lim_{x \to \infty} f(x) = \alpha$ for some finite $\alpha$ implies that $\lim_{x \to \infty} f'(x) = 0$. Note that here we actually have $\alpha = 0$, but the value of $\alpha$ does not matter: adding or subtracting a constant changes $\lim_{x \to \infty} f(x)$, while the derivative kills constants, so it does not change.



We prove this by contradiction. Suppose that $f'(x)$ does not tend to $0$ as $x \to \infty$. Negating the definition of the limit being zero, we get: there exists $\epsilon > 0$ such that for every $r$ there exists $x > r$ with $|f'(x)| > \epsilon$.



Now take $r = 1, 2, \ldots$ in this statement to get points $x_i > i$ (so $x_i \to \infty$) at which $|f'| > \epsilon$. There are infinitely many such points, so either the set of points where $f' < -\epsilon$ or the set of points where $f' > \epsilon$ (or both) is infinite. Without loss of generality, assume that $f'(x_i) > \epsilon$ for all $i$.



Now, we will see what happens if $f'$ is just continuous.




Since $f'$ is continuous at each $x_i$ and $f'(x_i) > \epsilon$, there exists $\delta_i > 0$, depending on $x_i$, such that $|y - x_i| < \delta_i$ implies $f'(y) > \epsilon/2$.




Now, what happens under uniform continuity?




Since $f'$ is uniformly continuous, there exists a single $\delta > 0$, not depending on $x_i$, such that $|y - x| < \delta$ implies $|f'(y) - f'(x)| < \epsilon/2$; in particular, $|y - x_i| < \delta$ implies $f'(y) > \epsilon/2$ for every $i$.




Ok, so what extra is uniform continuity giving us? Not clear so far.



Consider the quantity $D_i = \int_{a}^{x_i + \delta} f' - \int_{a}^{x_i} f' = \int_{x_i}^{x_i + \delta} f'$. Since $f' > \epsilon/2$ on these intervals, we see that $D_i > \epsilon\delta/2$ for all $i$.



However, from the fundamental theorem of calculus, we know that $D_i = f(x_i + \delta) - f(x_i)$. Since $\lim_{x \to \infty} f(x) = \alpha$ and $x_i \to \infty$, both terms on the right tend to $\alpha$, so $\lim_{i \to \infty} D_i = 0$. But this cannot happen: $D_i > \epsilon\delta/2$ for every $i$, so $D_i$ cannot get closer than that to zero. Contradiction.
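
In one line, the contradiction is the chain
$$0 < \frac{\epsilon\delta}{2} < D_i = f(x_i+\delta)-f(x_i) \longrightarrow \alpha - \alpha = 0 \qquad (i \to \infty).$$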



What happens if we go back to mere continuity? The problem is that $D_i$ is now $\int_{x_i}^{x_i + \delta_i} f'$, and the bound obtained is $D_i > \epsilon\delta_i/2$. Since the $\delta_i$ need not be bounded away from zero, this no longer prevents $D_i$ from converging to zero, which was the whole trick behind the contradiction.
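
You can watch the $\delta_i$ shrink in the counterexample $f(x)=\sin(x^2)/x$ from the beginning: there $f'(x)=2\cos(x^2)-\sin(x^2)/x^2$, and for large $n$ the set around $x_n=\sqrt{2\pi n}$ on which $f'>1$ is contained in an interval of length of order $1/x_n\to 0$. So the admissible $\delta_n$ tend to zero, and $f(x_n+\delta_n)-f(x_n)\to 0$ yields no contradiction.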







answered Nov 13 at 9:55 by астон вілла олоф мэллбэрг (edited Nov 13 at 16:56)












  • Thanks for giving me a nice exercise. I solved the first one, but I had difficulty with the weaker condition that $f'$ is uniformly continuous. Can you give me some hint?
    – Shubham, Nov 13 at 10:15










  • Sure. Suppose, for a contradiction to $\lim f' = 0$, that $f'$ is above $\epsilon$ at infinitely many points. This gives us infinitely many intervals on which $f'$ is greater than $\epsilon$, by continuity, and the sum of the lengths of these intervals is infinite, by uniform continuity. So the integral of $f'$ over the union of these intervals should be infinite, since $f'$ dominates the constant function $\epsilon$ on a union of infinite total length. But the integral of $f'$ over an interval is related to $f$ via the fundamental theorem of calculus. Find a contradiction.
    – астон вілла олоф мэллбэрг, Nov 13 at 10:24












  • If you are done, give me a reply. Else, ask me to edit the answer: this is the famous Barbalat lemma that you are now trying to prove, and it should get a good proof on the site.
    – астон вілла олоф мэллбэрг, Nov 13 at 11:07










  • Sir, thanks. But I don't understand why uniform continuity implies that the sum of the interval lengths becomes infinite. I think even plain continuity would give the same. Please tell me what I am missing.
    – Shubham, Nov 13 at 15:16












  • I apologize, I thought wrong. I have added a proof of the lemma, but the approach is different. I have highlighted where uniform continuity has been used, though.
    – астон вілла олоф мэллбэрг, Nov 13 at 16:56



















