# Taylor Series

**Taylor series is a tool used to accomplish [[Polynomials|polynomial]] interpolation of [[Function (in Algebra)|any function]]. It takes a hard-to-compute function and spits out an easy-to-calculate polynomial, which is an ==approximation of the original function with a certain accuracy. It works by looking at the rate of change of $f$ around $a$, combined with the rate of change of the rate of change of $f$ around $a$ (the second derivative), and so on== ad infinitum.** We can tweak the accuracy by varying the number of elements we take into account and discarding the rest. ^qnnyu9

**Taylor series works best for values close to a certain value (denoted as $a$). The series is said to be centered around that value.** If $a=0$, that is, if the series is centered around the $y$ axis, the series is also known as the Maclaurin series.

$
T_f (a,x)=\sum_{k=0}^\infty \frac{f^{(k)}(a)}{k!}(x-a)^k
$

Where $f^{(k)}$ is the $k$-th [[Derivative]] of $f(x)$ and $k!$ is $k$ [[Factorial]].

A linear approximation of a function is just its Taylor series evaluated with the first two elements, so $L = f(a) + f'(a)(x-a)$. We can then also define a quadratic approximation to be the function's Taylor series with the first three elements.

A way to reason about what the series does is to peel off a couple of its first elements and think about the link between them:

$
T_f (a,x)= f(a) + f'(a)(x-a) + \frac 1 2 f''(a)(x-a)^2 + \dots
$

- The first element is just the function's value at $a$. It sets a base point.
- $f'$ is the rate of change of $f$ with respect to a change of $x$. Think of it as speed and of $(x-a)$ as the time passed since $a$.
- To continue with the motion analogy, $f''$ is then acceleration, as it corresponds to the rate of change of the rate of change of $f$.
- The next element will account for a change of acceleration. But if we know the acceleration is constant, or if we believe it will have little impact, we can just stop here.
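A minimal sketch of the linear and quadratic approximations in Python. The choice of $f(x)=\sqrt x$ centered at $a=1$ is my own example (not from the note above), with the first two derivatives computed by hand:

```python
import math

def taylor_sqrt(x, a=1.0, order=2):
    """Truncated Taylor series of sqrt(x) centered at a.

    Derivatives hand-computed for this example:
    f(a) = sqrt(a), f'(a) = 1/(2*sqrt(a)), f''(a) = -1/(4*a^(3/2)).
    """
    f0 = math.sqrt(a)            # base point: f(a)
    f1 = 0.5 / math.sqrt(a)      # "speed": f'(a)
    f2 = -0.25 / a ** 1.5        # "acceleration": f''(a)
    terms = [f0, f1 * (x - a), 0.5 * f2 * (x - a) ** 2]
    return sum(terms[: order + 1])

# Approximating sqrt(1.1) near a = 1:
linear = taylor_sqrt(1.1, order=1)     # -> 1.05
quadratic = taylor_sqrt(1.1, order=2)  # -> 1.04875 (true value ~1.0488088)
```

Note how each extra element shrinks the error: the quadratic approximation is already within $10^{-4}$ of the true value, which is exactly the "stop when the remaining terms have little impact" idea.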
This allows the Taylor series to be used without evaluating the sum to infinity. The series is convergent for every polynomial (but not necessarily for every function), because every derivative beyond the polynomial's degree is zero, leaving at most $n+1$ non-zero elements for a degree-$n$ polynomial. In fact, evaluating the series until it terminates yields the polynomial itself.

## Common Taylor Series

$
\begin{gather}
e^x = \sum_{k=0}^\infty \frac {x^k} {k!} \\
\cos x = \sum_{k=0}^\infty (-1)^k \frac {x^{2k}}{(2k)!} = 1 - \frac 1 2 x^2 + \frac 1 {24}x^4 - \frac 1 {720}x^6 + \dots \\
\sin x = \sum_{k=0}^\infty (-1)^k \frac {x^{2k + 1}}{(2k+1)!} \\
\cosh x = \sum_{k=0}^\infty \frac {x^{2k}}{(2k)!} \\
\sinh x = \sum_{k=0}^\infty \frac {x^{2k + 1}}{(2k+1)!} \\
\frac 1 {1-x} = \sum_{k=0}^\infty x^k \; \text{for } |x| < 1 \\
\ln(1+x) = \sum_{k=1}^\infty (-1)^{k+1} \frac {x^k} k \; \text{for } |x| < 1 \\
\arctan x = \sum_{k=0}^\infty (-1)^k \frac {x^{2k + 1}}{2k+1} \; \text{for } |x| < 1 \\
(1+x)^j= \sum_{k=0}^\infty \binom j k x^k \; \text{for } |x| < 1
\end{gather}
$
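The first two series in the table can be checked numerically. A small Python sketch; the truncation counts (`terms`) are arbitrary choices of mine, picked large enough for the error to fall below floating-point noise at these inputs:

```python
import math

def exp_series(x, terms=20):
    """e^x = sum over k of x^k / k!, truncated after `terms` elements."""
    return sum(x ** k / math.factorial(k) for k in range(terms))

def cos_series(x, terms=12):
    """cos x = sum over k of (-1)^k * x^(2k) / (2k)!, truncated."""
    return sum((-1) ** k * x ** (2 * k) / math.factorial(2 * k)
               for k in range(terms))

# Both are Maclaurin series (centered at a = 0), so they are most
# accurate near x = 0 and need more terms as |x| grows.
approx_e = exp_series(1.0)          # close to math.e
approx_cos = cos_series(math.pi)    # close to -1
```

Because these are centered at $a=0$, fewer terms suffice near zero; evaluating far from the center is where the "works best close to $a$" caveat bites.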