How to show the existence of the limit $\lim_{n\to \infty}\frac{x_n}{n}$ if $x_n$ satisfies $x^{-n}=\sum_{k=1}^\infty...$























Suppose $x_n$ is the only positive solution to the equation $x^{-n}=\sum\limits_{k=1}^\infty (x+k)^{-n}$. How can one show the existence of the limit $\lim_{n\to \infty}\frac{x_n}{n}$?



It is easy to see that $\{x_n\}$ is increasing. In fact, the given equation is equivalent to
$$1=\sum_{k=1}^\infty\left(1+\frac{k}{x}\right)^{-n} \tag{*}$$
If $x_n\ge x_{n+1}$, then, noticing that for any fixed $k$ the map $x\mapsto(1+\frac{k}{x})^{-n}$ is increasing, we get
$$\frac{1}{(1+\frac{k}{x_n})^n}\ge \frac{1}{(1+\frac{k}{x_{n+1}})^n}>\frac{1}{(1+\frac{k}{x_{n+1}})^{n+1}}$$
Summing over all $k$ from $1$ to $\infty$, we see that
$$\sum_{k=1}^\infty\frac{1}{(1+\frac{k}{x_n})^n}>\sum_{k=1}^\infty\frac{1}{(1+\frac{k}{x_{n+1}})^{n+1}}$$
but by $(*)$ both series in the above inequality equal $1$, which is a contradiction!



But it seems hard to show the existence of $\lim_{n\to \infty}\frac{x_n}{n}$. What I can see by the area principle is

$$\Big|\sum_{k=1}^\infty\frac{1}{(1+\frac{k}{x_n})^n}-\int_1^\infty \frac{dx}{(1+\frac{x}{x_n})^{n}}\Big|<\frac{1}{(1+\frac1{x_n})^n}$$
or
$$\Big|1-\frac{x_n}{n-1}\Big(1+\frac{1}{x_n}\Big)^{1-n}\Big|<\frac{1}{(1+\frac1{x_n})^n}$$
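Before attacking the limit analytically, a quick numerical experiment is reassuring. The sketch below is not part of the question; the series truncation length and the number of bisection steps are my own arbitrary choices. It solves $(*)$ for a few values of $n$ and collects the ratios $x_n/n$:

```python
import math

def phi(x, n, terms=20000):
    # Truncation of sum_{k>=1} (x/(x+k))^n; the neglected tail
    # is tiny for the moderate x and n used below.
    return sum((x / (x + k)) ** n for k in range(1, terms + 1))

def solve_xn(n):
    # phi(., n) is increasing, vanishes as x -> 0, and exceeds 1 by x = 3n
    # for these n, so bisection on [0, 3n] brackets the unique root x_n.
    lo, hi = 1e-9, 3.0 * n
    for _ in range(40):
        mid = 0.5 * (lo + hi)
        if phi(mid, n) < 1.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Ratios x_n / n for a few n; they increase toward 1/log 2 ~ 1.4427
ratios = {n: solve_xn(n) / n for n in (2, 5, 10, 20)}
```

The computed ratios come out increasing and below $1/\log 2\approx 1.4427$, which is exactly the behaviour established in the answers.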






































      sequences-and-series limits














      edited Nov 7 at 13:53

























      asked Nov 3 at 13:45









      mbfkk























4 Answers























accepted










For any $n \ge 2$, consider the function $\displaystyle\;\Phi_n(x) = \sum_{k=1}^\infty \left(\frac{x}{x+k}\right)^n$.



It is easy to see that $\Phi_n(x)$ is an increasing function over $(0,\infty)$.
For small $x$, it is bounded from above by $x^n \zeta(n)$ and hence decreases to $0$ as $x \to 0$.
For large $x$, we can approximate the sum by an integral, and $\Phi_n(x)$ diverges like $\displaystyle\;\frac{x}{n-1}$ as $x \to \infty$. By definition, $x_n$ is the unique root of $\Phi_n(x_n) = 1$. Let $\displaystyle\;y_n = \frac{x_n}{n}$.



For any $\alpha > 0$, apply AM $\ge$ GM to $n$ copies of $1 + \frac{\alpha}{n}$ and one copy of $1$; we obtain

$$\left(1 + \frac{\alpha}{n}\right)^{n/(n+1)} < \frac1{n+1} \left[n\left(1 + \frac{\alpha}{n}\right) + 1 \right] = 1 + \frac{\alpha}{n+1}$$
The inequality is strict because the $n+1$ numbers are not all identical. Raising both sides to the power $n+1$ and taking reciprocals, we get
$$\left( \frac{n}{n + \alpha} \right)^n > \left(\frac{n+1}{n+1 + \alpha}\right)^{n+1}$$



Replacing $\alpha$ by $\displaystyle\;\frac{k}{y_n}$ for a generic positive integer $k$, we obtain

$$\left( \frac{x_n}{x_n + k} \right)^n = \left( \frac{n y_n}{n y_n + k} \right)^n > \left(\frac{(n+1)y_n}{(n+1)y_n + k}\right)^{n+1}$$
Summing over $k$ and using the definition of $x_n$, we find

$$\Phi_{n+1}(x_{n+1}) = 1 = \Phi_n(x_n) > \Phi_{n+1}((n+1)y_n)$$

Since $\Phi_{n+1}$ is increasing, we obtain $x_{n+1} > (n+1)y_n \iff y_{n+1} > y_n$.
This means $(y_n)$ is an increasing sequence.



We are going to show that $y_n$ is bounded from above by $\frac32$
(see the update below for a more elementary and better upper bound).
For simplicity, let us abbreviate $x_n$ and $y_n$ as $x$ and $y$. By their definition, we have

$$\frac{2}{x^n} = \sum_{k=0}^\infty \frac{1}{(x+k)^n}$$

By the Abel–Plana formula, we can transform the sum on the RHS into integrals (the $\frac{3}{2x^n}$ below arises from moving the $\frac12 f(0) = \frac{1}{2x^n}$ term to the left). The end result is

$$\begin{align}\frac{3}{2x^n} &= \int_0^\infty \frac{dk}{(x+k)^n} +
i \int_0^\infty \frac{(x+it)^{-n} - (x-it)^{-n}}{e^{2\pi t} - 1}\, dt\\
&=\frac{1}{(n-1)x^{n-1}}
+ \frac{i}{x^{n-1}}\int_0^\infty \frac{(1+is)^{-n} - (1-is)^{-n}}{e^{2\pi x s}-1}\, ds
\end{align}$$

Multiplying both sides by $nx^{n-1}$ and replacing $s$ by $s/n$, we obtain

$$\begin{align}\frac{3}{2y} - \frac{n}{n-1} &=
i \int_0^\infty \frac{(1 + i\frac{s}{n})^{-n} - (1-i\frac{s}{n})^{-n}}{e^{2\pi ys} - 1}\, ds\\
&= 2\int_0^\infty \frac{\sin\left(n\tan^{-1}\left(\frac{s}{n}\right)\right)}{\left(1 + \frac{s^2}{n^2}\right)^{n/2}} \frac{ds}{e^{2\pi ys}-1}\tag{*1}
\end{align}$$

For the integrand on the RHS to be negative, we need

$$n\tan^{-1}\left(\frac{s}{n}\right) > \pi
\implies \frac{s}{n} > \tan\left(\frac{\pi}{n}\right) \implies s > \pi$$

By the time $s$ reaches $\pi$, the factor $\frac{1}{e^{2\pi ys} - 1}$ has already dropped to a very small value. Numerically, we know $y_4 > 1$, so for $n \ge 4$ and $s \ge \pi$ we have

$$\frac{1}{e^{2\pi ys} - 1} \le \frac{1}{e^{2\pi^2} - 1} \approx 2.675 \times 10^{-9}$$

This implies the integral is positive. For $n \ge 4$, we can deduce

$$\frac{3}{2y} \ge \frac{n}{n-1} \implies y_n \le \frac32\left(1 - \frac1n\right) < \frac32$$



Since $y_n$ is increasing and bounded from above by $\frac32$, the limit
$y_\infty \stackrel{\text{def}}{=} \lim_{n\to\infty} y_n$ exists and is $\le \frac32$.



For fixed $y > 0$, with the help of the DCT, one can show that the last integral of $(*1)$ converges.

This suggests that $y_\infty$ is a root, near $\frac32$, of the equation

$$\frac{3}{2y} = 1 + 2\int_0^\infty \frac{\sin(s)}{e^{2\pi ys} - 1}\, ds$$



According to the DLMF,
$$\int_0^\infty e^{-x}\, \frac{\sin(ax)}{\sinh x}\, dx = \frac{\pi}{2}\coth\left(\frac{\pi a}{2}\right) - \frac1a\quad\text{ for }\quad a \ne 0$$



We can transform our equation into

$$\frac{3}{2y} = 1 + 2\left[\frac{1}{4y}\coth\left(\frac{1}{2y}\right) - \frac12\right]
\iff \coth\left(\frac{1}{2y}\right) = 3$$

This leads to $\displaystyle\;y_\infty = \frac{1}{\log 2}$.
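The last step can be checked directly: $\coth t = 3$ means $\tanh t = \frac13$, and $\operatorname{artanh}\frac13 = \frac12\log\frac{1+1/3}{1-1/3} = \frac12\log 2$, so $y = 1/(2t) = 1/\log 2$. A short numerical confirmation (my own illustrative check, plain Python):

```python
import math

# coth(t) = 3  <=>  tanh(t) = 1/3  <=>  t = atanh(1/3) = (1/2) log 2,
# hence y = 1/(2t) = 1/log 2.
t = math.atanh(1.0 / 3.0)
y_inf = 1.0 / (2.0 * t)

def coth(u):
    # hyperbolic cotangent
    return math.cosh(u) / math.sinh(u)
```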



This is consistent with the finding of another answer (currently deleted):

    If $L_\infty = \lim_{n\to\infty}\frac{n}{x_n}$ exists, then $L_\infty = \log 2$.

To summarize, the limit of $\displaystyle\;\frac{x_n}{n}$ exists and should equal $\displaystyle\;\frac{1}{\log 2}$.





          Update



It turns out there is a more elementary proof that $y_n$ is bounded from above by the optimal bound $\displaystyle\;\frac{1}{\log 2}$.

Recall that for any $\alpha > 0$ we have $1 + \alpha < e^\alpha$. Substituting $\alpha = \frac{k}{n}\log 2$ for $n \ge 2$ and $k \ge 1$, we get

$$\frac{n}{n + k\log 2} = \frac{1}{1 + \frac{k}{n}\log 2} > e^{-\frac{k}{n}\log 2} = 2^{-\frac{k}{n}}$$

This leads to

$$\Phi_n\left(\frac{n}{\log 2}\right)
= \sum_{k=1}^\infty \left(\frac{n}{n + k\log 2}\right)^n
> \sum_{k=1}^\infty 2^{-k}
= 1 = \Phi_n(x_n)$$

Since $\Phi_n(x)$ is increasing, this means
$\displaystyle\;\frac{n}{\log 2} > x_n$, i.e. $y_n$ is bounded from above by $\displaystyle\;\frac{1}{\log 2}$.
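The term-by-term comparison in this update is easy to verify numerically. In the sketch below (my own illustrative check; the truncation length $K$ and the choice $n=10$ are arbitrary), each term $(n/(n+k\log 2))^n$ dominates $2^{-k}$, and the truncated $\Phi_n(n/\log 2)$ already exceeds $1$:

```python
import math

LN2 = math.log(2.0)

def check(n, K=2000):
    # Term-by-term: (n/(n + k log 2))^n > 2^{-k} for every k >= 1, so the
    # partial sums of Phi_n(n/log 2) dominate those of the geometric
    # series sum_{k>=1} 2^{-k} = 1.
    termwise = all((n / (n + k * LN2)) ** n > 2.0 ** (-k) for k in range(1, K + 1))
    partial = sum((n / (n + k * LN2)) ** n for k in range(1, K + 1))
    return termwise, partial
```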





























• Nicely done! Thanks for your reply. By the way, how can we prove that the limit is $\frac1{\log 2}$, i.e. that it is no less than $\frac1{\log 2}$?
  – mbfkk
  Nov 8 at 11:28










• @mbfkk I don't have a 'rigorous' proof that $y_\infty = \frac{1}{\log 2}$; otherwise I would have included it in my answer. I've already tried a few tricks, but none of them work.
  – achille hui
  Nov 8 at 11:44










• I have got a proof that $y_\infty=\frac{1}{\ln 2}$; see the third answer below.
  – mbfkk
  Nov 9 at 8:44































Consider the functions
$$f_n(x):=\sum_{k=1}^\infty\left(\frac{x}{x+k}\right)^n.$$
(The series converges for every fixed $x\geq 0$ and $n\geq 2$.)
Then the values $x_n$ are the solutions of
$$f_n(x)=1.$$
We have $f_n(0)=0$, and because
$$f_n'(x)=\sum_{k=1}^{\infty}n\left(\frac{x}{x+k}\right)^{n-1}\frac{k}{(x+k)^2},$$
we have $f'_n(x)>0$ for $x>0$.
Moreover, bounding the sum from below by its first three terms,
$$f_n(3n)=\sum_{k=1}^{\infty}\left(\frac{3n}{3n+k}\right)^n\geq 3\left(\frac{3n}{3n+3}\right)^n=3\left(1+\frac{1}{n}\right)^{-n}.$$
Since $\lim_{n\to\infty}(1+\frac{1}{n})^n=e$, we have $$\lim_{n\to\infty}f_n(3n)\geq\frac{3}{e}>1,$$ so there exists $N\in\mathbb N$ such that
$$f_n(3n)>1$$
for all $n\geq N$.

Thus, for large enough $n$ we have $x_n\in(0,3n)$, hence
$$0\leq\frac{x_n}{n}\leq 3\quad\text{for all }n\geq N.$$
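In fact, since $(1+\frac1n)^n$ increases to $e<3$, the lower bound $3(1+\frac1n)^{-n}$ already exceeds $1$ for every $n\ge 2$, not only in the limit. A small numerical sketch (my own illustration; the series truncation length is an arbitrary choice, and truncating only underestimates $f_n$, so the check stays valid):

```python
def f(x, n, terms=20000):
    # Truncation of f_n(x) = sum_{k>=1} (x/(x+k))^n (an underestimate).
    return sum((x / (x + k)) ** n for k in range(1, terms + 1))

# The truncated f_n(3n) still dominates 3(1+1/n)^{-n}, because that
# bound only uses the first three terms of the series.
bounds = [(n, f(3 * n, n), 3.0 * (1.0 + 1.0 / n) ** (-n)) for n in (2, 3, 5, 10)]
```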

















































Below is my idea for proving $\lim\limits_{n\to \infty}\frac{x_n}{n}=\frac{1}{\ln 2}$.

For any $\lambda >0$,
\begin{align*}
\Phi_n(\lambda n)=\sum_{k=1}^\infty \left( \frac{\lambda n}{\lambda n+k}\right)^n
\end{align*}

Denote $a_{n,k}=\left( \frac{\lambda n}{\lambda n+k}\right)^n$. It is easy to verify that $a_{n,k}$ is decreasing in $n$, and
\begin{align*}
\lim_{n\to \infty}a_{n,k}=e^{-k/\lambda}\triangleq b_k
\end{align*}

We notice that $\sum_{k=1}^\infty b_k=\sum_{k=1}^\infty e^{-k/\lambda}=\frac{1}{e^{1/\lambda}-1}$, that $a_{n,k}\le a_{2,k}$ for $n\geq 2$, and that $\sum_k a_{2,k}$ is convergent. Meanwhile, we can verify the following proposition (an analogue of Lebesgue's dominated convergence theorem):

Suppose $\{a_{n,k}\}$ is a positive doubly indexed sequence such that for every $k\in \mathbb{N}_+$ we have $a_{n,k}\to b_k$ as $n\to\infty$, and $|a_{n,k}|\le a_k$ where $\sum_{k=1}^\infty a_k$ is convergent. Then
\begin{align*}
\lim_{n\to \infty}\sum_{k=1}^\infty a_{n,k}=\sum_{k=1}^\infty b_k
\end{align*}

Thanks to the above proposition, we see that
\begin{align*}
\lim_{n\to \infty}\Phi_n(\lambda n)=\sum_{k=1}^\infty e^{-k/\lambda}=\frac{1}{e^{1/\lambda}-1}
\end{align*}

In particular, taking $\lambda=\frac{1}{\ln 2}$ gives $\lim_{n\to \infty}\Phi_n\left(\frac{n}{\ln 2}\right)=1=\Phi_n(x_n)$. Thus for every $s>\frac{1}{\ln 2}$, since
\begin{align*}
\lim_{n\to \infty }\Phi_n(s n)=\frac{1}{e^{1/s}-1}>1=\lim_{n\to \infty}\Phi_n(x_n)
\end{align*}
there exists $N$ such that for all $n>N$,
\begin{align*}
\Phi_n(s n)>\Phi_n(x_n)\Rightarrow sn>x_n
\end{align*}

This implies $A=\lim\limits_{n\to \infty }y_n\leqslant s$, and thus $A\leqslant \frac{1}{\ln 2}$. Similarly we can prove $A\geqslant \frac{1}{\ln 2}$, and finally we get $A=\frac{1}{\ln 2}$.
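The dominated-convergence computation above can be spot-checked numerically. In this sketch (my own illustration; the truncation length and the sample values of $n$ are arbitrary choices), $\Phi_n(\lambda n)$ with $\lambda = 1/\ln 2$ decreases in $n$ toward the predicted limit $1$:

```python
import math

def phi_scaled(lam, n, terms=30000):
    # Truncated Phi_n(lam * n) = sum_{k>=1} (lam*n / (lam*n + k))^n
    x = lam * n
    return sum((x / (x + k)) ** n for k in range(1, terms + 1))

lam = 1.0 / math.log(2.0)
predicted = 1.0 / (math.exp(1.0 / lam) - 1.0)  # equals 1 for this lambda
vals = [phi_scaled(lam, n) for n in (5, 20, 80)]
```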





























• (+1) Good job; this settles that the limit $A$ is $\frac{1}{\log 2}$. In fact, we no longer need to assume $A$ exists to get its value. For any $s > \frac{1}{\log 2}$, $y_n \le s$ for sufficiently large $n$ implies $\limsup\limits_{n\to\infty} y_n \le s$. This in turn implies $\limsup\limits_{n} y_n \le \inf s = \frac{1}{\log 2}$. Similarly, we have $\frac{1}{\log 2} \le \liminf\limits_{n\to\infty} y_n$. Since limsup = liminf, the limit exists and equals $\frac{1}{\log 2}$.
  – achille hui
  Nov 9 at 11:00

































We can rewrite $$x^{-n} = \sum_{k=1}^\infty (x+k)^{-n}$$

as

$$1= \sum_{k=1}^\infty e^{- n\ln (1+ k/x_n)}.$$

Now $\ln (1+k/x_n) \le k/x_n$, so $e^{-n\ln(1+k/x_n)} \ge e^{-\frac{n}{x_n}k}$, and therefore

$$1 \ge \sum_{k=1}^\infty e^{-\frac{n}{x_n}k} = \frac{1}{e^{n/x_n}-1}.$$

From this it follows that

$$ (*) \quad n /x_n \ge \ln 2.$$



Suppose now that $\limsup_{n\to\infty} n/x_n=M$ and let $c<M$ be arbitrary. Then for infinitely many $n$ we have $n/x_n>c$, and along those $n$

\begin{align*} 1 &= \sum_{k=1}^\infty e^{-n \ln (1+\frac{k}{n} \cdot \frac{n}{x_n})}\\
& \le \sum_{k=1}^\infty e^{-n \ln (1+ \frac{k}{n} c)}\\
& = \sum_{k=1}^\infty \left(1+\frac{k}{n}c\right)^{-n} \\
& \to \sum_{k=1}^\infty e^{-kc}=\frac{1}{e^c-1}
\end{align*}

by dominated convergence (note: $(1+\frac{k}{n}c)^{-n} \le (1+\frac{kc}{2})^{-2}$ for $n\ge 2$).

Thus, $e^c-1 \le 1$, i.e. $c \le \ln 2$. Since $c<M$ was arbitrary, it follows that

$$(**) \quad \limsup_{n\to\infty} n/x_n \le \ln 2.$$

Now $(*)$ and $(**)$ give

$$\lim_{n\to\infty} \frac{x_n}{n} = \sup_{n} \frac{x_n}{n} = \frac{1}{\ln 2}.$$
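The dominating bound used in the dominated-convergence step rests on $(1+\frac{a}{n})^n$ being increasing in $n$; a grid check (my own illustration; the ranges and the sample value of $c$ are arbitrary choices) confirms it:

```python
# Since (1 + a/n)^n increases in n, for n >= 2 we get
# (1 + kc/n)^(-n) <= (1 + kc/2)^(-2), a dominating sequence summable in k.
c = 0.7  # sample value with c > ln 2; the bound holds for any c > 0
ok = all(
    (1.0 + k * c / n) ** (-n) <= (1.0 + k * c / 2.0) ** (-2)
    for n in range(2, 40)
    for k in range(1, 200)
)
```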



























              Your Answer





              StackExchange.ifUsing("editor", function () {
              return StackExchange.using("mathjaxEditing", function () {
              StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
              StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
              });
              });
              }, "mathjax-editing");

              StackExchange.ready(function() {
              var channelOptions = {
              tags: "".split(" "),
              id: "69"
              };
              initTagRenderer("".split(" "), "".split(" "), channelOptions);

              StackExchange.using("externalEditor", function() {
              // Have to fire editor after snippets, if snippets enabled
              if (StackExchange.settings.snippets.snippetsEnabled) {
              StackExchange.using("snippets", function() {
              createEditor();
              });
              }
              else {
              createEditor();
              }
              });

              function createEditor() {
              StackExchange.prepareEditor({
              heartbeatType: 'answer',
              convertImagesToLinks: true,
              noModals: true,
              showLowRepImageUploadWarning: true,
              reputationToPostImages: 10,
              bindNavPrevention: true,
              postfix: "",
              imageUploader: {
              brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
              contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
              allowUrls: true
              },
              noCode: true, onDemand: true,
              discardSelector: ".discard-answer"
              ,immediatelyShowMarkdownHelp:true
              });


              }
              });














               

              draft saved


              draft discarded


















              StackExchange.ready(
              function () {
              StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f2982897%2fhow-to-show-the-existence-of-the-limit-lim-n-to-infty-fracx-nn-if-x-n%23new-answer', 'question_page');
              }
              );

              Post as a guest















              Required, but never shown

























              4 Answers
              4






              active

              oldest

              votes








              4 Answers
              4






              active

              oldest

              votes









              active

              oldest

              votes






              active

              oldest

              votes








              up vote
              3
              down vote



              accepted
              +50










              For any $n ge 2$, consider the function $displaystyle;Phi_n(x) = sum_{k=1}^infty left(frac{x}{x+k}right)^n$.



              It is easy to see $Phi_n(x)$ is an increasing function over $(0,infty]$.
              For small $x$, it is bounded from above by $x^n zeta(n)$ and hence decreases to $0$ as $x to 0$.
              For large $x$, we can approximate the sum by an integral and $Phi_n(x)$ diverges like $displaystyle;frac{x}{n-1}$ as $x to infty$. By definition, $x_n$ is the unique root for $Phi_n(x_n) = 1$. Let $displaystyle;y_n = frac{x_n}{n}$.



              For any $alpha > 0$, apply AM $ge$ GM to $n$ copies of $1 + frac{alpha}{n}$ and one copy of $1$, we obtain



              $$left(1 + frac{alpha}{n}right)^{n/n+1} > frac1{n+1} left[nleft(1 + frac{alpha}{n}right) + 1 right] = 1 + frac{alpha}{n+1}$$
              The inequality is strict because the $n+1$ numbers are not identical. Taking reciprocal on both sides, we get
              $$left( frac{n}{n + alpha} right)^n ge left(frac{n+1}{n+1 + alpha}right)^{n+1}
              $$



              Replace $alpha$ by $displaystyle;frac{k}{y_n}$ for generic positive integer $k$, we obtain



              $$left( frac{x_n}{x_n + k} right)^n = left( frac{n y_n}{n y_n + k} right)^n > left(frac{(n+1)y_n}{(n+1)y_n + k}right)^{n+1}$$
              Summing over $k$ and using definition of $x_n$, we find



              $$Phi_{n+1}(x_{n+1}) = 1 = Phi_n(x_n) > Phi_{n+1}((n+1)y_n)$$



              Since $Phi_{n+1}$ is increasing, we obtain $x_{n+1} > (n+1)y_n iff y_{n+1} > y_n$.
              This means $y_n$ is an increasing sequence.



              We are going to show $y_n$ is bounded from above by $frac32$
              (see update below for a more elementary and better upper bound).
              For simplicity, let us abberivate $x_n$ and $y_n$ as $x$ and $y$. By their definition, we have



              $$frac{2}{x^n} = sum_{k=0}^infty frac{1}{(x+k)^n}$$



              By Abel-Plana formula, we can transform the sum on RHS to integrals. The end result is



              $$begin{align}frac{3}{2x^n} &= int_0^infty frac{dk}{(x+k)^n} +
              i int_0^infty frac{(x+it)^{-n} - (x-it)^{-n}}{e^{2pi t} - 1} dt\
              &=frac{1}{(n-1)x^{n-1}}
              + frac{1}{x^{n-1}}int_0^infty frac{(1+is)^{-n} - (1-is)^{-n}}{e^{2pi x s}-1} ds
              end{align}
              $$

              Multiply both sides by $nx^{n-1}$ and replace $s$ by $s/n$, we obtain



              $$begin{align}frac{3}{2y} - frac{n}{n-1} &=
              i int_0^infty frac{(1 + ifrac{s}{n})^{-n} - (1-ifrac{s}{n})^{-n}}{e^{2pi ys} - 1} ds\
              &= 2int_0^infty frac{sinleft(ntan^{-1}left(frac{s}{n}right)right)}{left(1 + frac{t^2}{n^2}right)^{n/2}} frac{ds}{e^{2pi ys}-1}tag{*1}
              end{align}
              $$

              For the integral on RHS, if we want its integrand to be negative, we need



              $$ntan^{-1}left(frac{s}{n}right) > pi
              implies frac{s}{n} > tanleft(frac{pi}{n}right) implies s > pi$$



              By the time $s$ reaches $pi$, the factor $frac{1}{e^{2pi ys} - 1}$ already drops to very small. Numerically, we know $y_4 > 1$, so for $n ge 4$ and $s ge pi$, we have



              $$frac{1}{e^{2pi ys} - 1} le frac{1}{e^{2pi^2} - 1} approx 2.675 times 10^{-9}$$



              This implies the integral is positive. For $n ge 4$, we can deduce



              $$frac{3}{2y} ge frac{n}{n-1} implies y_n le frac32left(1 - frac1nright) < frac32$$



              Since $y_n$ is increasing and bounded from above by $frac32$, limit
              $y_infty stackrel{def}{=} lim_{ntoinfty} y_n$ exists and $le frac32$.



              For fixed $y > 0$, with help of DCT, one can show the last integral of $(*1)$
              converges.

              This suggests $y_infty$ is a root of following equation near $frac32$



              $$frac{3}{2y} = 1 + 2int_0^infty frac{sin(s)}{e^{2pi ys} - 1} ds$$



              According to DLMF,
              $$int_0^infty e^{-x} frac{sin(ax)}{sinh x} dx = frac{pi}{2}cothleft(frac{pi a}{2}right) - frac1aquadtext{ for }quad a ne 0$$



              We can transform our equation to



              $$frac{3}{2y} = 1 + 2left[frac{1}{4y}cothleft(frac{1}{2y}right) - frac12right]
              iff cothleft(frac{1}{2y}right) = 3$$



              This leads to $displaystyle;y_infty = frac{1}{log 2}$.



              This is consistent with the finding of another answer (currently deleted):




              If $L_infty = lim_{ntoinfty}frac{n}{x_n}$ exists, then $L_infty = log 2$.




              To summarize, the limit $displaystyle;frac{x_n}{n}$ exists and should equal to $displaystyle;frac{1}{log 2}$.





              Update



              It turns out there is a more elementary proof that $y_n$ is bounded from above by the optimal bound $displaystyle;frac{1}{log 2}$.



              Recall for any $alpha > 0$. we have $1 + alpha < e^alpha$. Substitute
              $alpha$ by $frac{k}{n}log 2$ for $n ge 2$ and $k ge 1$, we get



              $$frac{n}{n + klog 2} = frac{1}{1 + frac{k}{n}log 2} > e^{-frac{k}{n}log 2} = 2^{-frac{k}{n}}$$



              This leads to



              $$Phi_nleft(frac{n}{log 2}right)
              = sum_{k=1}^infty left(frac{n}{n + log 2 k}right)^n
              > sum_{k=1}^infty 2^{-k}
              = 1 = Phi_n(x_n)
              $$

              Since $Phi_n(x)$ is increasing, this means
              $displaystyle;frac{n}{log 2} > x_n$ and $y_n$ is bounded from above by $displaystyle;frac{1}{log 2}$.






              share|cite|improve this answer























              • Nice done! Thanks for your reply.By the way,how can we prove that the limit is $frac1{log 2}$?,i.e. it's no less than $frac1{log 2}$.
                – mbfkk
                Nov 8 at 11:28










              • @mbfkk I don't have a 'rigorous' proof that $y_infty = frac{1}{log 2}$, otherwise I would include that in my answer. I've already tried a few tricks but none of them work.
                – achille hui
                Nov 8 at 11:44










              • I have got a proof that $y_infty=frac{1}{ln 2}$,see the third floor.
                – mbfkk
                Nov 9 at 8:44















              up vote
              3
              down vote



              accepted
              +50










              For any $n ge 2$, consider the function $displaystyle;Phi_n(x) = sum_{k=1}^infty left(frac{x}{x+k}right)^n$.



              It is easy to see $Phi_n(x)$ is an increasing function over $(0,infty]$.
              For small $x$, it is bounded from above by $x^n zeta(n)$ and hence decreases to $0$ as $x to 0$.
              For large $x$, we can approximate the sum by an integral and $Phi_n(x)$ diverges like $displaystyle;frac{x}{n-1}$ as $x to infty$. By definition, $x_n$ is the unique root for $Phi_n(x_n) = 1$. Let $displaystyle;y_n = frac{x_n}{n}$.



              For any $alpha > 0$, apply AM $ge$ GM to $n$ copies of $1 + frac{alpha}{n}$ and one copy of $1$, we obtain



              $$left(1 + frac{alpha}{n}right)^{n/n+1} > frac1{n+1} left[nleft(1 + frac{alpha}{n}right) + 1 right] = 1 + frac{alpha}{n+1}$$
              The inequality is strict because the $n+1$ numbers are not identical. Taking reciprocal on both sides, we get
              $$left( frac{n}{n + alpha} right)^n ge left(frac{n+1}{n+1 + alpha}right)^{n+1}
              $$



              Replace $alpha$ by $displaystyle;frac{k}{y_n}$ for generic positive integer $k$, we obtain



              $$left( frac{x_n}{x_n + k} right)^n = left( frac{n y_n}{n y_n + k} right)^n > left(frac{(n+1)y_n}{(n+1)y_n + k}right)^{n+1}$$
              Summing over $k$ and using definition of $x_n$, we find



              $$Phi_{n+1}(x_{n+1}) = 1 = Phi_n(x_n) > Phi_{n+1}((n+1)y_n)$$



              Since $Phi_{n+1}$ is increasing, we obtain $x_{n+1} > (n+1)y_n iff y_{n+1} > y_n$.
              This means $y_n$ is an increasing sequence.



              We are going to show $y_n$ is bounded from above by $frac32$
              (see update below for a more elementary and better upper bound).
              For simplicity, let us abberivate $x_n$ and $y_n$ as $x$ and $y$. By their definition, we have



              $$frac{2}{x^n} = sum_{k=0}^infty frac{1}{(x+k)^n}$$



              By Abel-Plana formula, we can transform the sum on RHS to integrals. The end result is



              $$begin{align}frac{3}{2x^n} &= int_0^infty frac{dk}{(x+k)^n} +
              i int_0^infty frac{(x+it)^{-n} - (x-it)^{-n}}{e^{2pi t} - 1} dt\
              &=frac{1}{(n-1)x^{n-1}}
              + frac{1}{x^{n-1}}int_0^infty frac{(1+is)^{-n} - (1-is)^{-n}}{e^{2pi x s}-1} ds
              end{align}
              $$

              Multiply both sides by $nx^{n-1}$ and replace $s$ by $s/n$, we obtain



              $$begin{align}frac{3}{2y} - frac{n}{n-1} &=
              i int_0^infty frac{(1 + ifrac{s}{n})^{-n} - (1-ifrac{s}{n})^{-n}}{e^{2pi ys} - 1} ds\
              &= 2int_0^infty frac{sinleft(ntan^{-1}left(frac{s}{n}right)right)}{left(1 + frac{t^2}{n^2}right)^{n/2}} frac{ds}{e^{2pi ys}-1}tag{*1}
              end{align}
              $$

              For the integral on RHS, if we want its integrand to be negative, we need



              $$ntan^{-1}left(frac{s}{n}right) > pi
              implies frac{s}{n} > tanleft(frac{pi}{n}right) implies s > pi$$



              By the time $s$ reaches $pi$, the factor $frac{1}{e^{2pi ys} - 1}$ already drops to very small. Numerically, we know $y_4 > 1$, so for $n ge 4$ and $s ge pi$, we have



              $$frac{1}{e^{2pi ys} - 1} le frac{1}{e^{2pi^2} - 1} approx 2.675 times 10^{-9}$$



              This implies the integral is positive. For $n ge 4$, we can deduce



              $$frac{3}{2y} ge frac{n}{n-1} implies y_n le frac32left(1 - frac1nright) < frac32$$



              Since $y_n$ is increasing and bounded from above by $frac32$, limit
              $y_infty stackrel{def}{=} lim_{ntoinfty} y_n$ exists and $le frac32$.



              For fixed $y > 0$, with help of DCT, one can show the last integral of $(*1)$
              converges.

              This suggests $y_infty$ is a root of following equation near $frac32$



              $$frac{3}{2y} = 1 + 2int_0^infty frac{sin(s)}{e^{2pi ys} - 1} ds$$



              According to DLMF,
              $$int_0^infty e^{-x} frac{sin(ax)}{sinh x} dx = frac{pi}{2}cothleft(frac{pi a}{2}right) - frac1aquadtext{ for }quad a ne 0$$



              We can transform our equation to



              $$frac{3}{2y} = 1 + 2left[frac{1}{4y}cothleft(frac{1}{2y}right) - frac12right]
              iff cothleft(frac{1}{2y}right) = 3$$



              This leads to $displaystyle;y_infty = frac{1}{log 2}$.



              This is consistent with the finding of another answer (currently deleted):




              If $L_infty = lim_{ntoinfty}frac{n}{x_n}$ exists, then $L_infty = log 2$.




              To summarize, the limit $displaystyle;frac{x_n}{n}$ exists and should equal to $displaystyle;frac{1}{log 2}$.





              Update



              It turns out there is a more elementary proof that $y_n$ is bounded from above by the optimal bound $displaystyle;frac{1}{log 2}$.



              Recall for any $alpha > 0$. we have $1 + alpha < e^alpha$. Substitute
              $alpha$ by $frac{k}{n}log 2$ for $n ge 2$ and $k ge 1$, we get



              $$frac{n}{n + klog 2} = frac{1}{1 + frac{k}{n}log 2} > e^{-frac{k}{n}log 2} = 2^{-frac{k}{n}}$$



              This leads to



              $$Phi_nleft(frac{n}{log 2}right)
              = sum_{k=1}^infty left(frac{n}{n + log 2 k}right)^n
              > sum_{k=1}^infty 2^{-k}
              = 1 = Phi_n(x_n)
              $$

              Since $Phi_n(x)$ is increasing, this means
              $displaystyle;frac{n}{log 2} > x_n$ and $y_n$ is bounded from above by $displaystyle;frac{1}{log 2}$.






              share|cite|improve this answer























              • Nice done! Thanks for your reply.By the way,how can we prove that the limit is $frac1{log 2}$?,i.e. it's no less than $frac1{log 2}$.
                – mbfkk
                Nov 8 at 11:28










              • @mbfkk I don't have a 'rigorous' proof that $y_\infty = \frac{1}{\log 2}$, otherwise I would include that in my answer. I've already tried a few tricks but none of them work.
                – achille hui
                Nov 8 at 11:44










              • I have got a proof that $y_\infty=\frac{1}{\ln 2}$; see my answer below.
                – mbfkk
                Nov 9 at 8:44














              answered Nov 7 at 17:08, edited Nov 7 at 19:55 – achille hui










              Consider the functions
              $$f_n(x):=\sum_{k=1}^\infty\left(\frac{x}{x+k}\right)^n.$$
              (The series converges for every fixed $x\geq 0$ and $n\geq 2$.)
              Then the values $x_n$ are the solutions of
              $$f_n(x)=1.$$
              We have $f_n(0)=0$, and because of
              $$f_n'(x)=\sum_{k=1}^{\infty}n\left(\frac{x}{x+k}\right)^{n-1}\frac{k}{(x+k)^2},$$
              we have $f'_n(x)>0$ for $x>0$.
              Moreover,
              $$f_n(3n)=\sum_{k=1}^{\infty}\left(\frac{3n}{3n+k}\right)^n\geq 3\left(\frac{3n}{3n+3}\right)^n=3\left(1+\frac{1}{n}\right)^{-n}.$$
              Since $\lim_{n\to\infty}(1+\frac{1}{n})^n=e$, we have $$\lim_{n\to\infty}f_n(3n)\geq\frac{3}{e}>1,$$ so there exists $N\in\mathbb{N}$ such that
              $$f_n(3n)>1$$
              for all $n\geq N$.

              Thus, for large enough $n$ we have $x_n\in(0,3n)$, and hence
              $$0\leq\liminf_{n\to\infty}\frac{x_n}{n}\leq\limsup_{n\to\infty}\frac{x_n}{n}\leq 3.$$
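As a numerical sanity check (my own sketch): $3\left(1+\frac1n\right)^{-n} > \frac3e > 1$ already holds for every $n \ge 1$, and truncated partial sums of $f_n(3n)$, which only under-estimate the series, still sit above this bound.

```python
# Truncated check of f_n(3n) >= 3*(1 + 1/n)^(-n) > 1; partial sums only
# under-estimate f_n(3n), so the comparison below is conservative.
for n in (2, 5, 10, 50):
    bound = 3 * (1 + 1 / n) ** (-n)
    partial = sum((3 * n / (3 * n + k)) ** n for k in range(1, 5000))
    print(n, round(bound, 4), round(partial, 4))
```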







                  answered Nov 7 at 11:58, edited Nov 7 at 18:42 – weee






















                      Below is my approach to proving $\lim\limits_{n\to\infty}\frac{x_n}{n}=\frac{1}{\ln 2}$.

                      For any $\lambda >0$,
                      \begin{align*}
                      \Phi_n(\lambda n)=\sum_{k=1}^\infty \left( \frac{\lambda n}{\lambda n+k}\right)^n.
                      \end{align*}

                      Denote $a_{n,k}=\left( \frac{\lambda n}{\lambda n+k}\right)^n$; it is easy to verify that $a_{n,k}$ is decreasing in $n$, and
                      \begin{align*}
                      \lim_{n\to \infty}a_{n,k}=e^{-k/\lambda}\triangleq b_k.
                      \end{align*}

                      We notice that $\sum_{k=1}^\infty b_k=\sum_{k=1}^\infty e^{-k/\lambda}=\frac{1}{e^{1/\lambda}-1}$, that $a_{n,k}\le a_{2,k}$ for $n\geq 2$, and that $\sum_k a_{2,k}$ converges. Meanwhile, we can verify the following proposition (an analogue of Lebesgue's dominated convergence theorem):

                      Suppose $\{a_{n,k}\}$ is a positive doubly indexed sequence such that for every $k\in \mathbb{N}_+$ we have $a_{n,k}\to b_k$ as $n\to\infty$, and $|a_{n,k}|<a_k$ where $\sum_{k=1}^\infty a_k$ converges. Then
                      \begin{align*}
                      \lim_{n\to \infty}\sum_{k=1}^\infty a_{n,k}=\sum_{k=1}^\infty b_k.
                      \end{align*}

                      Thanks to the above proposition, we see that
                      \begin{align*}
                      \lim_{n\to \infty}\Phi_n(\lambda n)=\sum_{k=1}^\infty e^{-k/\lambda}=\frac{1}{e^{1/\lambda}-1}.
                      \end{align*}

                      In particular, taking $\lambda=\frac{1}{\ln 2}$ gives $\lim_{n\to \infty}\Phi_n\left(\frac{n}{\ln 2}\right)=1=\Phi_n(x_n)$. Thus for any $s>\frac{1}{\ln 2}$, since
                      \begin{align*}
                      \lim_{n\to \infty}\Phi_n(s n)=\frac{1}{e^{1/s}-1}>1=\lim_{n\to \infty}\Phi_n(x_n),
                      \end{align*}
                      there exists $N$ such that for all $n>N$,
                      \begin{align*}
                      \Phi_n(s n)>\Phi_n(x_n)\Rightarrow sn>x_n.
                      \end{align*}

                      This implies $A=\lim\limits_{n\to \infty}y_n\leqslant s$, thus $A\leqslant \frac{1}{\ln 2}$. Similarly we can prove $A\geqslant \frac{1}{\ln 2}$, and finally we get $A=\frac{1}{\ln 2}$.
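The proposition's conclusion can be illustrated numerically: with $\lambda = \frac{1}{\ln 2}$, truncated values of $\Phi_n(\lambda n)$ should decrease toward $\frac{1}{e^{1/\lambda}-1} = 1$ as $n$ grows (a sketch of mine; `phi_scaled` is a hypothetical helper).

```python
import math

lam = 1 / math.log(2)

def phi_scaled(n):
    # Phi_n(lam * n), truncated; the omitted tail is far below the precision shown
    return sum((lam * n / (lam * n + k)) ** n for k in range(1, 4000))

vals = [phi_scaled(n) for n in (10, 100, 1000)]
print([round(v, 4) for v in vals])    # decreasing toward 1
```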






                      • (+1) Good job, this settles that the limit $A$ is $\frac{1}{\log 2}$. In fact, we no longer need to assume $A$ exists to get its value. For any $s > \frac{1}{\log 2}$, $y_n \le s$ for sufficiently large $n$ implies $\limsup\limits_{n\to\infty} y_n \le s$. This in turn implies $\limsup\limits_n y_n \le \inf s = \frac{1}{\log 2}$. Similarly, we have $\frac{1}{\log 2} \le \liminf\limits_{n\to\infty} y_n$. Since limsup = liminf, the limit exists and equals $\frac{1}{\log 2}$.
                        – achille hui
                        Nov 9 at 11:00

















                      answered Nov 9 at 8:42, edited Nov 9 at 9:04 – mbfkk































                      We can rewrite $$x^{-n} = \sum_{k=1}^\infty (x+k)^{-n}$$

                      as

                      $$1= \sum_{k=1}^\infty e^{- n\ln (1+ k/x_n)}.$$

                      Now $\ln (1+k/x_n) \le k/x_n$, so each summand satisfies $e^{-n\ln(1+k/x_n)} \ge e^{-\frac{n}{x_n}k}$, and therefore

                      $$1 \ge \sum_{k=1}^\infty e^{-\frac{n}{x_n}k} = \frac{1}{e^{n/x_n}-1}.$$

                      From this it follows that

                      $$ (*) \quad n /x_n \ge \ln 2.$$
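                      To spell out the geometric series behind $(*)$: writing $r = e^{-n/x_n} \in (0,1)$,

                      $$\sum_{k=1}^\infty e^{-\frac{n}{x_n}k} = \sum_{k=1}^\infty r^k = \frac{r}{1-r} = \frac{1}{e^{n/x_n}-1},$$

                      and the bound $1 \ge \frac{1}{e^{n/x_n}-1}$ rearranges to $e^{n/x_n} \ge 2$, i.e. $n/x_n \ge \ln 2$.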



                      Suppose now that $\limsup_{n\to\infty} n/x_n = M$ and fix any $c < M$. Then $n/x_n > c$ for infinitely many $n$, and along that subsequence

                      \begin{align*} 1 &= \sum_{k=1}^\infty e^{-n \ln (1+\frac{k}{n} \times \frac{n}{x_n})}\\
                      & \le \sum_{k=1}^\infty e^{-n \ln (1+ \frac{k}{n} c)}\\
                      & = \sum_{k=1}^\infty (1+\frac{k}{n}c)^{-n} \\
                      & \to \sum_{k=1}^\infty e^{-kc}=\frac{1}{e^c-1}
                      \end{align*}

                      by dominated convergence (note: $(1+\frac{k}{n}c)^{-n} \le (1+\frac{kc}{2})^{-2}$ for $n \ge 2$).

                      Thus $e^c-1 \le 1$, i.e. $c \le \ln 2$. Since $c < M$ was arbitrary, it follows that



                      $$(**) \quad \limsup_{n\to\infty} n/x_n \le \ln 2.$$

                      By $(*)$, $x_n/n \le 1/\ln 2$ for every $n$, while $(**)$ gives $\liminf_{n\to\infty} x_n/n \ge 1/\ln 2$; together,

                      $$\lim_{n\to\infty} \frac{x_n}{n} = \sup_{n} \frac{x_n}{n} = \frac{1}{\ln 2}.$$
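                      As a quick numerical sanity check (not part of the proof), one can approximate $x_n$ by bisection on a truncated version of the defining sum and watch $x_n/n$ approach $1/\ln 2 \approx 1.4427$. The sketch below is mine, not from the answer; the truncation cutoff `K` and the bisection bracket are heuristic choices.

```python
import math

def S(x, n, K=5000):
    # Truncated left-hand side of the rewritten equation:
    # sum_{k=1}^K (1 + k/x)^(-n); the tail decays fast for the
    # x-range used here, so K=5000 is ample.
    return sum((1.0 + k / x) ** (-n) for k in range(1, K + 1))

def x_n(n, tol=1e-10):
    # Each term (1 + k/x)^(-n) increases with x, so S(., n) is
    # increasing; bisect for the unique root of S(x, n) = 1.
    lo, hi = 1e-6, 10.0 * n
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if S(mid, n) < 1.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

for n in (10, 50, 200):
    print(n, x_n(n) / n)
print("limit 1/ln 2:", 1 / math.log(2))
```

                      The ratios climb toward $1/\ln 2$ as $n$ grows, consistent with $(*)$, which says $x_n/n$ approaches the limit from below.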






                          answered Nov 14 at 2:40









                          Fnacool
