Is there a “greatest function” that converges?












39












We just hit convergence tests in calculus, and learned that $\sum_{n=1}^{\infty} \frac{1}{n^p}$
converges for all $p > 1$. I thought that this was sort of a "barrier" between what converges and what diverges. Specifically, that setting $a_n=\frac{1}{n^{1+\epsilon}}$ is sort of the "greatest function" (I'll make this precise later) for which $\sum a_n$ converges.



But I realized that there are functions that dominate $\frac{1}{n^{1+\epsilon}}$ but not $\frac1n$, such as $\frac{1}{n\log(n)}$. Now, the sum of that specific example diverges, but it got me wondering whether $\frac{1}{n}$ is truly the "boundary". This leads me to two questions.



1) Is there a function $f$ that dominates $\frac{1}{n^p}$ for all $p>1$, meaning $$\lim_{x\to\infty} \frac{f(x)}{1/x^p}=\infty,$$ such that $$\sum_{n=1}^\infty f(n)$$ converges?



2) If so, is there (up to a constant) a function $g$ such that $\sum_{n=1}^\infty g(n)$ converges and $g$ dominates every other function $f$ for which $\sum_{n=1}^\infty f(n)$ converges?



I'm just a freshman in high school, so I apologize if this is a stupid question.










calculus sequences-and-series convergence

asked Feb 5 at 21:11 – Electric Moccasins
  • (2) I tried to format the question such that it excludes all the "trivial" answers, i.e., "no, because you can just take this function and multiply by any constant to get another function that dominates it and still converges". Multiplication by a constant is already excluded by requiring the limit to diverge rather than equal some constant, but something similar could still happen. I tried to include a sort of "parameter" on $f$ to patch this, but I couldn't make it formal. The "philosophy" of the question is about the form of the function, rather than a "cop-out" like that. – Electric Moccasins, Feb 5 at 21:16












  • How about $f(x) = \frac{1}{x\ln^2(x)}$ for your first question? – Jakobian, Feb 5 at 21:19








  • (9) It's a good question; part of a good answer will surely be to identify the most fruitful way to make the vague intuition behind it more precise. – Henning Makholm, Feb 5 at 21:21






  • (1) From math.stackexchange.com/a/452074/42969: Let $\sum_{n=1}^{\infty} c_n$ be any convergent series with positive terms. Then there exists a convergent series $\sum_{n=1}^{\infty} C_n$ with much bigger terms, in the sense that $\lim_{n\rightarrow\infty} C_n/c_n = \infty$. – Martin R, Feb 5 at 21:22






  • (1) There are lots of other great answers in that linked thread, "Is there a slowest rate of divergence of a series?", which I'm sure the OP will find very interesting. – Rahul, Feb 6 at 11:36
















2 Answers


















17












1) Yes: $f(n)=\frac{1}{n(\ln n)^2}$ (defined for $n \geq 2$).

2) No. Assume $f \geq 0$ and $\sum_{n \geq 1} f(n) < \infty$.

Then there exists an increasing sequence $N_n$ and some constant $C > 0$ such that $\sum_{k=N_n+1}^{N_{n+1}} f(k) \leq C\,2^{-n}$.

Now set $g(n)=(p+1)f(n)$, where $N_p < n \leq N_{p+1}$.

Then $\sum_{k=N_n+1}^{N_{n+1}} g(k) \leq C(n+1)2^{-n}$, thus $\sum_{n \geq 1} g(n)$ is finite and $g(n) \gg f(n)$.
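The construction in 2) is easy to check numerically. Below is a sketch in Python, assuming the concrete choices $f(n) = 1/n^2$ and blocks $N_p = 2^p$ (both are illustrative, not part of the answer): $\sum g(n)$ stays bounded even though $g(n)/f(n) \to \infty$.

```python
# Numerical sketch of the block construction: f is a convergent positive
# series (here f(n) = 1/n^2), and g multiplies f by (p + 1) on the p-th
# block (2^p, 2^(p+1)].  With N_p = 2^p, each block of f sums to less
# than 2^(-p-1), mirroring the C * 2^(-n) bound in the answer.

def f(n):
    return 1.0 / n**2

def block_index(n):
    # p such that 2^p < n <= 2^(p+1); n = 1 is lumped into block 0
    if n <= 1:
        return 0
    return (n - 1).bit_length() - 1

def g(n):
    return (block_index(n) + 1) * f(n)

total_f = sum(f(n) for n in range(1, 10**5))
total_g = sum(g(n) for n in range(1, 10**5))

print(total_f)              # close to pi^2/6 ~ 1.6449
print(total_g)              # still finite: below 1.25 + sum_{p>=1} (p+1) 2^(-p-1) = 2.75
print(g(10**4) / f(10**4))  # ratio g/f at n = 10^4; it grows like log2(n)
```

The exact constants depend on the choice of blocks; the point is only that weighting block $p$ by $p+1$ multiplies its contribution by a factor that the geometric decay $2^{-p}$ easily absorbs.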






answered Feb 5 at 21:21, edited Feb 6 at 1:36 – Mindlack
    7












    1) $$f(n) = \frac{1}{n \log(n)^2}$$

    2) No. Given any $g > 0$ such that $\sum_n g(n)$ converges, there is an increasing sequence $M_k$ such that $$\sum_{n \ge M_k} g(n) < 2^{-k}.$$

    Then $\sum_n g(n) h(n)$ converges, where $h(n) = k$ for $M_k \le n < M_{k+1}$.
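As a numerical sketch of this argument (taking $g(n) = 1/n^2$ purely as an example of a convergent series, and truncating all sums at a finite $N$), one can compute the thresholds $M_k$ and confirm that $\sum_n g(n)h(n)$ stays below $\sum_k k\,2^{-k} = 2$:

```python
import bisect

# Find M_k = smallest n with tail sum_{m >= n} g(m) < 2^(-k), then check
# that the weighted series sum g(n) h(n) stays below sum_k k 2^(-k) = 2.

def g(n):
    return 1.0 / n**2  # example convergent series

N = 200_000                        # truncation point for this sketch
tails = [0.0] * (N + 2)
for n in range(N, 0, -1):          # tails[n] ~ sum_{m >= n} g(m)
    tails[n] = tails[n + 1] + g(n)

M = []                             # M[k] = M_k
k, n = 0, 1
while 2.0**-k > 2.0 / N:           # stop before truncation error matters
    while tails[n] >= 2.0**-k:
        n += 1
    M.append(n)
    k += 1

def h(n):
    # h(n) = k on [M_k, M_{k+1}); 0 before M_0
    return max(0, bisect.bisect_right(M, n) - 1)

weighted = sum(g(n) * h(n) for n in range(1, N))
print(M[:4])      # first few thresholds (for 1/n^2, roughly doubling)
print(weighted)   # below 2 = sum_k k 2^(-k)
```

Each block $[M_k, M_{k+1})$ contributes at most $k \cdot 2^{-k}$ to the weighted sum, which is why the total stays under $2$ no matter how slowly $g$ decays.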






    answered Feb 5 at 21:21 – Robert Israel









    • (1) If we're being pedantic you need $f(n) = \dfrac{1}{n\log(n+1)^2}$ (or something similar) to prevent division by zero. – orlp, Feb 5 at 21:22












    • Can you expand on how you know $\sum g(n)h(n)$ converges? Just trying to wrap my head around this. – Electric Moccasins, Feb 6 at 0:54










    • @ElectricMoccasins Because $\sum g(n) h(n) \le \sum k\, 2^{-k}$, which converges. – Solomonoff's Secret, Feb 6 at 1:45












    • @Solomonoff'sSecret OK, I'm really not sure how you're getting that inequality. I get the inequality in Prof. Israel's answer, but not how it translates to the one you stated. Sorry, this is my first ever brush with "hard math". – Electric Moccasins, Feb 6 at 2:32












    • $$\sum_{n=M_k}^{M_{k+1}-1} g(n) h(n) = k \sum_{n=M_k}^{M_{k+1}-1} g(n) < k\, 2^{-k}$$ – Robert Israel, Feb 6 at 5:23












