Why is $\sin x = x - \frac{x^3}{3!} + \frac{x^5}{5!} - \cdots$ for all $x$?


























I'm fairly convinced that the Taylor series (more precisely, the Maclaurin series)
$$\sin x = x - \frac{x^3}{3!} + \frac{x^5}{5!} - \cdots$$
is exactly equal to the sine function at $x = 0$.

I'm also fairly sure that this series converges for all $x$.

What I'm not sure about is why the series is exactly equal to the sine function for every $x$.

I know exactly how to derive this expression, but the derivation doesn't make it clear that the series equals the sine function everywhere. Convergence alone does not imply equality; it only means the series has a well-defined value for every $x$.

I would also like to know: is the equality valid for values greater than $\frac{\pi}{2}$? I don't see how to prove that it works beyond the range where sine is naturally defined.
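To make this worry concrete, here is a small numerical sketch (my own illustration, not part of any standard derivation): the classic counterexample $f(x) = e^{-1/x^2}$ with $f(0)=0$ has a Maclaurin series that converges for all $x$ (it is identically zero), yet equals the function only at $x=0$.

```python
import math

def f(x):
    # Every derivative of f at 0 is 0, so its Maclaurin series is
    # identically zero -- it converges for all x, yet f(x) > 0 for x != 0.
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

x = 0.5
maclaurin_value = 0.0  # all coefficients f^(k)(0)/k! vanish
print(f(x), maclaurin_value)  # the series misses the function everywhere except 0
```

So convergence of a Taylor series really is a weaker property than convergence *to the function*.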






























  • Look up Taylor's theorem (also called the remainder estimation theorem); see en.wikipedia.org/wiki/Taylor%27s_theorem. It bounds the error you get when you truncate the expansion after, say, $n$ terms. Essentially, since the error bound can always be made smaller by summing more terms, the series converges to the function.
    – Chris K, Nov 22 '13 at 23:23

  • Sine is defined for all real numbers.
    – Adi Dani, Nov 22 '13 at 23:24

  • The Taylor series for $\sin x$ and $\cos x$ converge on the whole real line, so the series do represent the functions for every $x \in \Bbb R$.
    – DonAntonio, Nov 22 '13 at 23:34

  • The easiest reason is perhaps to note that the series obeys the same differential equation with the same initial conditions. Apply your favorite uniqueness theorem (Picard–Lindelöf should be fine if you decompose properly).
    – ex0du5, Nov 23 '13 at 0:09

  • This question cannot be answered without knowing the rules of the game: what is your definition of $\sin$?
    – Christian Blatter, Nov 23 '13 at 19:07
















calculus taylor-expansion






asked Nov 22 '13 at 23:18 by user108425
edited Nov 26 at 16:24 by Shaun




1 Answer














First, take the Taylor polynomial $\displaystyle T_n(x) = \sum\limits_{k=0}^{n}\frac{f^{(k)}(a)}{k!}(x-a)^k$ at a given point $a$, and define the remainder
$\displaystyle R_n(x) = f(x) - T_n(x).$

If we can prove that $\lim\limits_{n\rightarrow \infty}R_n(x) = 0$, then $f(x) = \lim\limits_{n\rightarrow\infty}T_n(x)$; that is, the Taylor series is exactly equal to the function. Now we can use Taylor's theorem:

$\mathbf{Theorem\,\, 1}$ If a function is $n+1$ times differentiable on an interval $I$ that contains the point $x=a$, then for every $x \in I$ there exists $z$ strictly between $a$ and $x$ such that:

$\displaystyle R_n(x) = \frac{f^{(n+1)}(z)}{(n+1)!}(x-a)^{n+1}$. (This is known as the Lagrange remainder.)$\hspace{1cm}\blacksquare$

So, for every Taylor series, we must prove that $\lim\limits_{n\rightarrow \infty}R_n(x) = 0$ in order for the series to be exactly equal to the function.

Example: prove that for $f(x) = \sin(x)$, the Maclaurin series (the Taylor series with $a=0$) is exactly equal to the function for all $x$.

First, we know that $\left|\, f^{(n+1)}(z)\right| \leq 1$, because $\left|\sin(x)\right|\leq 1$ and $|\cos(x)|\leq 1$. So we have:

$\displaystyle 0\leq \left|R_n\right| = \frac{\left|f^{(n+1)}(z)\right|}{(n+1)!}\left|x\right|^{n+1} \leq \frac{\left|x\right|^{n+1}}{(n+1)!}$

It is easy to prove that $\displaystyle\lim\limits_{n\rightarrow\infty}\frac{\left|x\right|^n}{n!} = 0$ for all $x\in \mathbb{R}$ (for example, with d'Alembert's ratio test for series convergence). Therefore, by the squeeze theorem (in my country we call it the "two policemen and the burglar" theorem :D), $\left|R_n\right| \rightarrow 0$ as $n\rightarrow\infty$, which is equivalent to $R_n\rightarrow 0$ as $n\rightarrow\infty$ (because $\left|R_n - 0\right| = \left|\left|R_n\right| - 0\right|$).

Therefore, by Theorem 1, $f(x) = \lim\limits_{n\rightarrow\infty}T_n(x)$ for all $x$ within the radius of convergence.
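The remainder bound above is easy to check numerically (a minimal sketch; `sin_partial` and the variable names are my own):

```python
import math

def sin_partial(x, n_terms):
    """Partial sum x - x^3/3! + x^5/5! - ... with n_terms terms."""
    return sum((-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
               for k in range(n_terms))

x = 2.0
for n in (3, 6, 10):
    approx = sin_partial(x, n)
    # Lagrange bound for truncating after the degree-(2n-1) term:
    # |R| <= |x|^(2n) / (2n)!  since every derivative of sin is bounded by 1.
    bound = abs(x) ** (2 * n) / math.factorial(2 * n)
    assert abs(approx - math.sin(x)) <= bound
```

Each partial sum lands within the Lagrange bound of `math.sin(x)`, and the bound itself shrinks toward $0$ as $n$ grows, exactly as the squeeze argument predicts.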






Now, let's find the radius of convergence.

$\mathbf{Theorem\,\, 2}$ For the Maclaurin series of a function,

$f(x) = \displaystyle\sum\limits_{k=0}^{+\infty}c_k x^k = c_0 + c_1x + c_2x^2 + \dots,$

the equality holds for $x$ within the radius of convergence $R$, given by

$\displaystyle\frac{1}{R} = \limsup\limits_{k \rightarrow \infty} \left|c_k\right|^{1/k}$ (the Cauchy–Hadamard formula);

that is, the power series converges for all $x$ satisfying $\left|\,x\,\right| < R$, where $R\in [0,+\infty]$. $\hspace{1cm}\blacksquare$

Let's look at our example, $\sin(x)$. We have

$\displaystyle\sin(x) = x - \frac{x^3}{3!} + \frac{x^5}{5!} - \dots = \sum\limits_{k=0}^{+\infty}c_k x^{2k+1},$

where $\displaystyle c_k = \frac{(-1)^k}{(2k+1)!}$.

Substituting into the formula for $R$, we get

$\displaystyle \frac{1}{R} = \limsup\limits_{k\rightarrow\infty}\frac{1}{\sqrt[k]{(2k+1)!}}=0$ (this can be proven with Stirling's approximation), and therefore

$\displaystyle \boxed{R = +\infty}.$

So our series converges for all $\displaystyle|x| < +\infty \,\Longleftrightarrow\, \boxed{-\infty < x < +\infty}$, as expected.

Finally, we have $f(x) = \lim\limits_{n\rightarrow\infty}T_n(x)$ for all $x\in\mathbb{R}$, which is what we wanted to prove. $\blacksquare$
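Since $R = +\infty$, the series works far beyond $\pi/2$ as well. A quick numerical check (a sketch under the obvious stopping rule of summing until terms are negligible; `sin_series` is my own name):

```python
import math

def sin_series(x, tol=1e-15):
    # Accumulate terms via the recurrence t_{k+1} = -t_k * x^2 / ((2k+2)(2k+3)).
    # Stopping when terms fall below tol is justified because the terms are
    # eventually decreasing in magnitude and alternate in sign.
    total, term, k = 0.0, x, 0
    while abs(term) > tol:
        total += term
        k += 1
        term *= -x * x / ((2 * k) * (2 * k + 1))
    return total

for x in (0.1, math.pi / 2, 10.0):  # 10 is well past pi/2
    assert abs(sin_series(x) - math.sin(x)) < 1e-9
```

Even at $x = 10$, well outside any right-triangle definition of sine, the series sum matches `math.sin` to high precision, illustrating that the equality holds on the whole real line.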

























  • But the question is not whether the series is everywhere convergent, but whether the function it converges to actually is the sine function.
    – Lubin, Nov 23 '13 at 2:32

  • I edited it; the answer should be complete now.
    – Vidak, Nov 30 '13 at 12:42











answered Nov 23 '13 at 1:58 by Vidak (edited Nov 23 '13 at 19:14)







