“I’d far rather be happy than right any day.” (Douglas Adams, *The Hitchhiker’s Guide to the Galaxy*)

Definition. Complex sequence. A sequence of complex numbers is a function $a: \mathbb{N} \to \mathbb{C}$. We usually denote it by $(a_n)_{n \in \mathbb{N}}$, or simply $(a_n)$, where $a_n := a(n)$. The value $a_1$ is called the first term of the sequence, $a_2$ the second term, and in general $a_n$ the $n$-th term of the sequence.
Definition. Convergent complex sequence. A complex sequence $(a_n)_{n \in \mathbb{N}}$ is said to converge to a complex number $L \in \mathbb{C}$ if for every $\varepsilon > 0$ there exists an integer $N \in \mathbb{N}$ such that for all $n \geq N$ one has $|a_n - L| < \varepsilon$. In this case we write $\lim_{n \to \infty} a_n = L$ or $a_n \to L$ as $n \to \infty$, and $L$ is called the limit of the sequence $(a_n)_{n \in \mathbb{N}}$.
Definition. Cauchy sequence. A complex sequence $(a_n)_{n \in \mathbb{N}}$ is called a Cauchy sequence if for every $\varepsilon > 0$ there exists an integer $N \in \mathbb{N}$ such that for all $n, m \geq N$ one has $|a_n - a_m| < \varepsilon$.
Definition. Series and partial sums. Let $(a_n)_{n \in \mathbb{N}}$ be a complex sequence. For each $n \in \mathbb{N}$, the finite sum $s_n := a_1 + a_2 + \cdots + a_n = \sum_{k=1}^n a_k$ is called the $n$-th partial sum of the (infinite) series $\sum_{k=1}^\infty a_k$, which we also denote simply by $\sum a_n$ when the index is clear from the context.
Definition. Convergent series. The series $\sum_{n=1}^{\infty} a_n$ is said to converge to the sum $s \in \mathbb{C}$ if the sequence of partial sums $(s_n)_{n \in \mathbb{N}}$ defined by $s_n = a_1 + a_2 + \cdots + a_n = \sum_{k=1}^n a_k$ converges to $s$, that is, $\lim_{n \to \infty} s_n = s$. In this case we write $s := \sum_{n=1}^\infty a_n$. If the sequence $(s_n)_{n \in \mathbb{N}}$ does not converge, we say that the series $\sum_{n=1}^{\infty} a_n$ diverges (or does not converge).
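As a quick numerical sketch of these definitions (assuming Python; the helper `partial_sum` is an illustrative name, not from the text), we can compute partial sums of the geometric series $\sum_{k=1}^\infty (1/2)^k$, whose sum is $1$:

```python
# Numerical sketch: the n-th partial sum s_n = a_1 + ... + a_n of a series.
# For a_k = (1/2)^k, the partial sums s_n = 1 - 2^{-n} converge to 1.

def partial_sum(a, n):
    """n-th partial sum s_n = a(1) + a(2) + ... + a(n) of the sequence a."""
    return sum(a(k) for k in range(1, n + 1))

a = lambda k: (1 / 2) ** k          # terms of the geometric series
print(abs(partial_sum(a, 20) - 1))  # distance to the limit: 2^{-20}, tiny
```

Increasing $n$ drives $|s_n - 1|$ below any prescribed $\varepsilon > 0$, exactly as the definition of convergence demands.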
Definition. A complex power series centered at 0 in the variable $z$ is a series of the form $a_0 + a_1z + a_2z^2 + \cdots = \sum_{n=0}^\infty a_n z^n$ with coefficients $a_n \in \mathbb{C}$.
Definition. A complex power series centered at a complex number $a \in \mathbb{C}$ is an infinite series of the form $\sum_{n=0}^\infty a_n (z - a)^n$, where each $a_n \in \mathbb{C}$ is a coefficient, $z$ is a complex variable, and $(z - a)^n$ is the $n$-th power about the center.
Theorem. Given a power series $\sum_{n=0}^\infty a_n z^n$, there exists a unique value $R$, $0 \le R \le \infty$ (called the radius of convergence), such that the series converges absolutely for $|z| < R$ and diverges for $|z| > R$.
On the circle $|z| = R$, this theorem gives no information. This is the yellow-light zone: the series may converge or diverge there, and each point must be examined separately.
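When the limit exists, $R$ can be estimated from the coefficients via the ratio $|a_n / a_{n+1}|$. A minimal numerical sketch (Python; `ratio_estimate` and `coeff` are illustrative names): for $a_n = 2^n$ the series $\sum 2^n z^n$ is geometric in $2z$, so $R = 1/2$.

```python
# Numerical sketch: estimate the radius of convergence R by the ratio
# |a_n / a_{n+1}|, which tends to R when this limit exists.

def ratio_estimate(coeff, n):
    """Approximate R by |a_n / a_{n+1}| at a large index n."""
    return abs(coeff(n) / coeff(n + 1))

coeff = lambda n: 2 ** n
print(ratio_estimate(coeff, 50))   # 0.5, matching R = 1/2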
Differentiability of Power Series. If $f(z) = \sum_{n=0}^{\infty} a_nz^n$ for |z| < R (R > 0), then f is analytic on B(0; R) and $f'(z) = \sum_{n=1}^{\infty} na_nz^{n-1}$ for |z| < R.
Power series as a limit. A power series $f(z) = \sum_{n=0}^{\infty} a_nz^n$ defines a function by the limit of its partial sums: $f(z)= \lim_{N \to \infty} S_N(z), \text{ where } S_N(z)=\sum_{n=0}^{N}a_nz^n$.
We begin by defining a function $f(z)$ as a power series, $f(z) = \sum_{n=0}^{\infty} \frac{z^n}{n!}$. Our first step is to find out where this function is well-defined, i.e., for which complex numbers z the series converges.
We use the Ratio Test for absolute convergence. A series converges absolutely if the limit of the ratio of its terms is less than 1.
$\left| \frac{a_{n+1}(z)}{a_n(z)} \right| = \left| \frac{\frac{z^{n+1}}{(n+1)!}}{\frac{z^n}{n!}} \right| = \left| \frac{z^{n+1}}{(n+1)!} \cdot \frac{n!}{z^n} \right| = \left| \frac{z}{n+1} \right|$
Now, we take the limit as $n \to \infty, L = \lim_{n \to \infty} \left| \frac{z}{n+1} \right| =[\text{For any fixed complex number z, |z| is just a constant}] |z| \cdot \lim_{n \to \infty} \frac{1}{n+1} = |z| \cdot 0 = 0$. Since L = 0 < 1 for any complex number z, the condition for convergence, L < 1, is always satisfied, the series converges for every $z \in \mathbb{C}$ and its radius of convergence is $R = \infty$. We call such functions entire.
A fundamental theorem for power series states that we can differentiate them term by term inside their radius of convergence. Since $R = \infty$, we can do this for all of ℂ.
f’(z) = $\sum_{n=1}^{\infty} \frac{nz^{n-1}}{n!} = \sum_{n=1}^{\infty} \frac{z^{n-1}}{(n-1)!} =[\text{Readjusting the index}] \sum_{n=0}^{\infty} a_nz^n$ = f(z).
n = 0: The first term is $\frac{d}{dz}(\frac{z^0}{0!}) = \frac{d}{dz}(1) = 0$. That’s why the derivative sum starts at n = 1. By convention, 0! = 1 and in power-series $0^0 = 1$.
Besides, f(0) = $\sum_{n=0}^{\infty} \frac{0^n}{n!} = \frac{0^0}{0!} + \frac{0^1}{1!} + \frac{0^2}{2!} + \dots = 1$ and the given power series is a solution to the initial value problem defined by f’(z) = f(z) with f(0) = 1.
The Existence and Uniqueness Theorem for ODEs guarantees that this IVP has a unique solution. Since the function $e^z$ is known from real analysis to satisfy this exact same IVP, the power series f(z) must be the complex exponential $e^z = \sum_{n=0}^{\infty} \frac{z^n}{n!}$.
A key question arises: is every analytic function (a function f is analytic at a point a if it is differentiable at a and in a small disk around a) equal to a power series or representable as a power series within its domain?
The answer is nuanced. A single power series rarely describes f on its entire domain; its radius is limited by the nearest singularity. $\frac{1}{1-z}$ is well-defined and differentiable everywhere in the complex plane except at the single point z = 1 where it blows up, so its domain of analyticity is ℂ \ {1}. It can be expressed as a power series in a disk of convergence: $\frac{1}{1-z} = \sum_{n=0}^{\infty} z^n, \forall |z| \lt 1$.
The function f(z) is perfectly analytic at z = 2 (since $2 \neq 1$), but its power series $\sum z^n$ diverges there. This is not a contradiction! It just means that a single power series centered at a = 0 is not a global representation of the function. This is because the power series cannot cross or go past the problem point z = 1.
The big question is: Is every analytic function equal to some power series locally?, i.e. in some B(a; r) for r > 0 where a is a point of analyticity for f. The answer is locally, yes (Taylor’s Theorem).
Not all analytic functions can be represented by a single power series over their entire domain. However, every analytic function can be approximated by a power series in a small neighborhood around each point of analyticity.
Taylor’s Theorem. If f is analytic on an open disk B(a; R) (a disk of radius R centered at a), then f(z) can be represented exactly by a unique power series within that disk: $f(z) = \sum_{n=0}^{\infty}a_n (z - a)^n, \forall z \in B(a; R)$
It states that if a function is differentiable in a disk, it must also be infinitely differentiable and representable by a power series.
Furthermore, there exist unique constants $a_n = \frac{f^{(n)}(a)}{n!} = \frac{1}{2\pi i}\int_{C_r} \frac{f(w)}{(w-a)^{n+1}}dw$ where $C_r$ is a circle of radius r < R centered at a and oriented in the counterclockwise direction (positively oriented).
As we have demonstrated earlier, if a function $f$ is analytic, it is automatically infinitely differentiable, i.e., the expression $f^{(n)}(a)$ (the $n$-th derivative at $a$) makes sense. The Cauchy Integral Formula for Derivatives gives us a way to calculate $f^{(n)}(a)$ using an integral: $f^{(n)}(a) = \frac{n!}{2\pi i} \oint_{C_r} \frac{f(w)}{(w-a)^{n+1}} dw$