Week 8: Power Series (Continued)

Since we now know that a power series defines a continuous function within its region of convergence, one might wonder whether the converse is true, i.e. whether any continuous function can be written as a power series. This is where Taylor series prove to be extremely useful.
Definition:
For an infinitely differentiable function f: \mathbb{C} \to \mathbb{C}, define its Taylor series expansion around a to be the power series

    \[f(z) = \sum_{n=0}^\infty a_n (z-a)^n,\]

where a_n = \frac{f^{(n)}(a)}{n!}. When this series converges to f inside B(a,R), where R is its radius of convergence, it is the unique power series representation of f at a.
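For instance, for f(z) = \frac{1}{1-z} one computes f^{(n)}(z) = \frac{n!}{(1-z)^{n+1}}, so f^{(n)}(0) = n! and a_n = \frac{f^{(n)}(0)}{n!} = 1, giving the geometric series as the Taylor series at a = 0:

    \[\frac{1}{1-z} = \sum_{n=0}^\infty z^n, \qquad |z| < 1,\]

with radius of convergence R = 1.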
Examples:
Recall that the real exponential has Taylor series e^x = 1 + x + \frac{x^2}{2!}+... = \sum_{n=0}^\infty \frac{1}{n!} x^n. Thus, one defines the complex exponential

    \[e^z = \sum_{n=0}^\infty \frac{z^n}{n!},\]

with radius of convergence that can be determined by the ratio test to be

    \[\lim_{n \to \infty} \frac{\frac{1}{n!}}{\frac{1}{(n+1)!}} = \lim_{n \to \infty} n+1 = \infty.\]

Thus, the power series converges everywhere on the complex plane to e^z. The function obeys the same properties as the real exponential, namely,

    \[e^{z_1+z_2} = e^{z_1}e^{z_2}.\]
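These notes are purely analytic, but both claims are easy to sanity-check numerically. The following sketch (Python; the truncation at 60 terms and the sample points are assumptions, not part of the notes) compares partial sums of the series against the built-in cmath.exp and tests the functional equation:

```python
import cmath

def exp_series(z, terms=60):
    """Partial sum of sum_{n=0}^{terms-1} z^n / n!."""
    total, term = 0, 1  # term holds z^n / n!
    for n in range(terms):
        total += term
        term *= z / (n + 1)  # z^{n+1}/(n+1)! from z^n/n!
    return total

z1, z2 = 1 + 2j, -0.5 + 0.3j
# The series agrees with the built-in complex exponential...
assert abs(exp_series(z1) - cmath.exp(z1)) < 1e-12
# ...and satisfies e^{z1+z2} = e^{z1} e^{z2}.
assert abs(exp_series(z1 + z2) - exp_series(z1) * exp_series(z2)) < 1e-12
```

The running recurrence term *= z/(n+1) avoids computing factorials explicitly.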

The complex logarithm can be similarly defined as

    \[\log (1-z) = -\sum_{n=1}^\infty \frac{z^n}{n},\]

which converges for |z|<1. Similarly, one defines complex sine and cosine through their power series to be

    \[\sin z = \sum_{n=0}^\infty \frac{(-1)^{n} z^{2n+1}}{(2n+1)!}, \quad \cos z = \sum_{n=0}^\infty \frac{(-1)^n z^{2n}}{(2n)!},\]

which can also be checked to converge everywhere on \mathbb{C}. One is then also able to derive Euler’s formula using

    \[r(\cos \theta + i \sin \theta) = r\left(\sum_{n=0}^\infty \frac{(-1)^n \theta^{2n}}{(2n)!} + i\sum_{n=0}^\infty \frac{(-1)^n \theta^{2n+1}}{(2n+1)!}\right) = r\sum_{n=0}^\infty \frac{(i\theta)^n}{n!} = re^{i \theta}.\]
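As a hedged numerical sketch (the truncation at 40 terms and the sample angle are assumptions), one can verify Euler's formula by summing the sine and cosine series directly and comparing against e^{i\theta}:

```python
import cmath, math

def cos_series(t, terms=40):
    # sum_{n=0}^{terms-1} (-1)^n t^{2n} / (2n)!
    return sum((-1) ** n * t ** (2 * n) / math.factorial(2 * n) for n in range(terms))

def sin_series(t, terms=40):
    # sum_{n=0}^{terms-1} (-1)^n t^{2n+1} / (2n+1)!
    return sum((-1) ** n * t ** (2 * n + 1) / math.factorial(2 * n + 1) for n in range(terms))

theta = 2.0
# cos(theta) + i sin(theta) matches e^{i theta} to high precision.
lhs = cos_series(theta) + 1j * sin_series(theta)
assert abs(lhs - cmath.exp(1j * theta)) < 1e-12
```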


We have seen that a power series defines a continuous function, but is that function differentiable, and if so, how many derivatives can one take? What about integration?
Proposition:
Suppose f(z) = \sum_{n=0}^\infty a_n (z-a)^n is a complex power series with radius of convergence R. Then, for any 0<r<R, f(z) is differentiable and Riemann integrable on \overline{B(a,r)}, with

    \[f'(z) = \sum_{n=1}^\infty n a_n (z-a)^{n-1}, \quad \int_a^z f(y) dy =  \sum_{n=0}^\infty a_n \frac{(z-a)^{n+1}}{n+1}.\]
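As an illustration (a numerical sketch, using the geometric series \sum_{n=0}^\infty z^n = \frac{1}{1-z} with a=0 and a sample point |z|<1 chosen here as an assumption), term-by-term differentiation should produce \sum_{n \ge 1} n z^{n-1} = \frac{1}{(1-z)^2}:

```python
def geom(z, terms=200):
    # partial sum of the geometric series sum_{n>=0} z^n
    return sum(z ** n for n in range(terms))

def geom_derived(z, terms=200):
    # term-by-term derivative: sum_{n>=1} n z^{n-1}
    return sum(n * z ** (n - 1) for n in range(1, terms))

z = 0.3 + 0.2j  # inside the unit disc, |z| < 1
assert abs(geom(z) - 1 / (1 - z)) < 1e-12
assert abs(geom_derived(z) - 1 / (1 - z) ** 2) < 1e-12
```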

Proof:
Note that the radius of convergence is the same for the original, differentiated, and integrated series:

    \[R=\frac{1}{\limsup_{n \to \infty}|a_n|^{\frac{1}{n}}}=\frac{1}{\limsup_{n \to \infty} |na_n|^{\frac{1}{n}}} = \frac{1}{\limsup_{n \to \infty} |\frac{a_n}{n+1}|^{\frac{1}{n}}},\]

since \lim_{n \to \infty} n^{\frac{1}{n}} = \lim_{n \to \infty} (\frac{1}{n+1})^{\frac{1}{n}}=1. For the Riemann integral of f, we use the fact that the uniform limit of Riemann integrable functions is Riemann integrable; since the partial sums converge uniformly on \overline{B(a,r)}, we obtain

    \[\sum_{n=0}^k \int_a^z a_n (y-a)^n dy = \int_a^z \sum_{n=0}^k a_n (y-a)^n dy \to \int_a^z \sum_{n=0}^\infty a_n (y-a)^n dy = \int_a^z f(y)dy,\]

i.e.

    \[\sum_{n=0}^\infty a_n \frac{(z-a)^{n+1}}{n+1} = \int_a^z f(y) dy.\]
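As a concrete instance, applying this to the geometric series \sum_{n=0}^\infty y^n = \frac{1}{1-y} (with a=0 and |z|<1) gives

    \[\int_0^z \frac{dy}{1-y} = \sum_{n=0}^\infty \frac{z^{n+1}}{n+1} = \sum_{n=1}^\infty \frac{z^n}{n} = -\log(1-z),\]

recovering the power series expansion of the logarithm.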

For the derivative of f, we use Theorem 3.7.1 in the textbook, which states that if f_n' \to g uniformly and f_n(x_0) converges for at least one point x_0, then f_n converges uniformly to a function G, G is differentiable, and G'=g. (The first part of the theorem follows from the uniform convergence of the integrals \int f_n' and an application of the Fundamental Theorem of Calculus.) Here the differentiated series converges uniformly on \overline{B(a,r)} to some function g, and the partial sums of the series for f converge at every point of B(a,r), in particular at one point, so Theorem 3.7.1 gives that f is differentiable with

    \[g(z) = f'(z) \quad \forall z \in \overline{B(a,r)}.\]

Corollary:
Any power series defines an infinitely differentiable function on its region of convergence: all of its derivatives are continuous, the differentiated series all have the same radius of convergence R, and each converges uniformly on every closed ball \overline{B(a,r)} with 0<r<R. (Such functions are called smooth, or C^\infty, functions.)
Remark:
A natural question to ask is whether all smooth functions can be defined by a power series. Surprisingly, the answer is no. For example, the function f: \mathbb{R} \to \mathbb{R} given by

    \[f(x) =         \begin{cases}         e^{-\frac{1}{x}}, & x > 0 \\         0, & x \le 0\\         \end{cases}\]

is smooth, but every derivative of f vanishes at x=0, so its Taylor series at x=0 is identically zero: the series converges everywhere, yet it agrees with f only for x \le 0, on no neighborhood of 0. However, power series precisely correspond to functions that are “smooth” in the complex plane, called holomorphic functions, which satisfy remarkably rigid and nice properties and are beyond the scope of this course.
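The failure of analyticity can be seen numerically: e^{-1/x} decays to 0 faster than any power of x as x \to 0^+, which is exactly why every derivative at 0 vanishes. A small sketch (the sample point x = 0.01 is an arbitrary choice):

```python
import math

def f(x):
    # smooth function that is not given by its Taylor series at 0
    return math.exp(-1.0 / x) if x > 0 else 0.0

# f(x) / x^k stays tiny near 0 for every k: e^{-1/x} beats any power of x.
# This is why all derivatives of f at 0 vanish and the Taylor series is 0.
x = 0.01
for k in range(1, 9):
    assert f(x) / x ** k < 1e-20
```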
Finally, I want to state a test (without proof) that lets one check the continuity of a power series on the boundary of the region of convergence, known as Abel’s Test.
Theorem (Abel’s Test):
Let f(x)=\sum_{n=0}^\infty a_n (x-a)^n be a real power series with radius of convergence R. If the series converges at some x_0 \in \mathbb{R} with |x_0-a|=R, then f is continuous at x_0.
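For a standard illustration (chosen here as an example; it does not appear in the notes): the real series \sum_{n=1}^\infty \frac{x^n}{n} = -\log(1-x) has R=1 and converges at the boundary point x=-1 by the alternating series test, so Abel's Test says the sum is continuous there, forcing the boundary value to be -\log 2. A quick numerical check:

```python
import math

# Partial sum of sum_{n>=1} x^n / n at the boundary point x = -1.
# Abel's Test gives continuity at x = -1, so the sum must equal
# lim_{x -> -1^+} -log(1 - x) = -log(2).
s = sum((-1) ** n / n for n in range(1, 200001))
assert abs(s - (-math.log(2))) < 1e-5
```

The alternating-series error bound (at most the first omitted term, here 1/200001) guarantees the tolerance used above.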