"Life is a math equation. In order to gain the most, you have to know how to convert negatives into positives." (Anonymous)

Definition. Complex sequence. A sequence of complex numbers is a function $a: \mathbb{N} \to \mathbb{C}$. We usually denote it by $(a_n)_{n \in \mathbb{N}}$ or simply $(a_n)$, where $a_n := a(n)$. The value $a_1$ is called the first term of the sequence, $a_2$ the second term, and in general $a_n$ the $n$-th term of the sequence.
Definition. Convergent complex sequence. A complex sequence $(a_n)_{n \in \mathbb{N}}$ is said to converge to a complex number $L \in \mathbb{C}$ if for every $\varepsilon > 0$ there exists an integer $N \in \mathbb{N}$ such that for all $n \geq N$ one has $|a_n - L| < \varepsilon$. In this case we write $\lim_{n \to \infty} a_n = L$ or $a_n \to L$ as $n \to \infty$, and $L$ is called the limit of the sequence $(a_n)_{n \in \mathbb{N}}$.
Definition. Cauchy sequence. A complex sequence $(a_n)_{n \in \mathbb{N}}$ is called a Cauchy sequence if for every $\varepsilon > 0$ there exists an integer $N \in \mathbb{N}$ such that for all $n, m \geq N$ one has $|a_n - a_m| < \varepsilon$.
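To make these definitions concrete, here is a small numerical sketch (my own illustration, with the assumed example sequence $a_n = (0.5 + 0.5i)^n$, which converges to $L = 0$ because $|0.5 + 0.5i| = \tfrac{1}{\sqrt{2}} < 1$). The script finds an $N$ that works for a given $\varepsilon$ and spot-checks the Cauchy property beyond it.

```python
# Numerical sanity check of the convergence and Cauchy definitions.
# Example sequence: a_n = (0.5 + 0.5i)^n, which converges to L = 0.

def a(n: int) -> complex:
    return (0.5 + 0.5j) ** n

L = 0
eps = 1e-6

# Convergence: find an N with |a_N - L| < eps, then spot-check all n >= N up to 500.
N = next(n for n in range(1, 200) if abs(a(n) - L) < eps)
assert all(abs(a(n) - L) < eps for n in range(N, 500))
print(f"|a_n - L| < {eps} for every checked n >= {N}")

# Cauchy property: beyond N, any two terms are within 2*eps of each other.
assert all(abs(a(n) - a(m)) < 2 * eps for n in range(N, 200) for m in range(N, 200))
print("terms beyond N are pairwise within 2*eps of each other")
```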
Definition. Series and partial sums. Let $(a_n)_{n \in \mathbb{N}}$ be a complex sequence. For each $n \in \mathbb{N}$, the finite sum $s_n := a_1 + a_2 + \cdots + a_n = \sum_{k=1}^n a_k$ is called the $n$-th partial sum of the (infinite) series $\sum_{k=1}^\infty a_k$, which we also denote simply by $\sum a_n$ when the index is clear from the context.
Definition. Convergent series. The series $\sum_{n=1}^{\infty} a_n$ is said to converge to the sum $s \in \mathbb{C}$ if the sequence of partial sums $(s_n)_{n \in \mathbb{N}}$ defined by $s_n = a_1 + a_2 + \cdots + a_n = \sum_{k=1}^n a_k$ converges to $s$, that is, $\lim_{n \to \infty} s_n = s$. In this case we write $s = \sum_{n=1}^\infty a_n$. If the sequence $(s_n)_{n \in \mathbb{N}}$ does not converge, we say that the series $\sum_{n=1}^{\infty} a_n$ diverges (or does not converge).
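For example (a standard fact, not stated in the text above): the geometric series $\sum_{k=0}^\infty z^k$ converges to $\frac{1}{1-z}$ whenever $|z| < 1$. A minimal numerical sketch of its partial sums, with an assumed sample point $z = 0.3 + 0.4i$:

```python
# Partial sums s_n of the geometric series sum_{k=0}^infty z^k for |z| < 1.
# They should approach the known limit 1 / (1 - z).

z = 0.3 + 0.4j            # |z| = 0.5 < 1
limit = 1 / (1 - z)

s = 0
for n in range(60):
    s += z ** n           # s is now the partial sum up to the z^n term
    if n in (4, 9, 19, 39, 59):
        print(f"{n + 1:2d} terms:  |s - limit| = {abs(s - limit):.2e}")
```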
Definition. A complex power series centered at 0 in the variable $z$ is a series of the form $a_0 + a_1z + a_2z^2 + \cdots = \sum_{n=0}^\infty a_n z^n$ with coefficients $a_n \in \mathbb{C}$.
Definition. A complex power series centered at a complex number $a \in \mathbb{C}$ is an infinite series of the form $\sum_{n=0}^\infty a_n (z - a)^n,$ where each $a_n \in \mathbb{C}$ is a coefficient, $z$ is a complex variable, and $(z - a)^n$ is the $n$-th power about the center.
Theorem. Given a power series $\sum_{n=0}^\infty a_n z^n$, there exists a unique value $R$, $0 \le R \le \infty$ (called the radius of convergence), such that the series converges absolutely whenever $|z| < R$ and diverges whenever $|z| > R$.
On the circle $|z| = R$, this theorem gives no information. This is the yellow-light zone: the series could converge or diverge.
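One standard way to compute $R$ (not stated in the theorem above, but consistent with it) is the root test / Cauchy–Hadamard formula $\frac{1}{R} = \limsup_{n \to \infty} |a_n|^{1/n}$. Below is a rough numerical sketch (my own) that evaluates $|a_n|^{1/n}$ at a single large $n$ for three familiar coefficient sequences; the behavior on the boundary $|z| = R$ still has to be examined case by case (for instance, $\sum z^n$ diverges at every point of $|z| = 1$, while $\sum z^n/n^2$ converges at every point of it).

```python
import math

# Root-test estimate of the radius of convergence: 1/R = limsup |a_n|^(1/n).
# We evaluate |a_n|^(1/n) at a single large n as a stand-in for the limsup.

series = {
    "sum z^n          (R = 1)":   lambda n: 1.0,
    "sum 2^n z^n      (R = 1/2)": lambda n: 2.0 ** n,
    "sum z^n / n!     (R = inf)": lambda n: 1.0 / math.factorial(n),
}

n = 100
for name, a in series.items():
    print(f"{name}:  |a_n|^(1/n) at n = {n} is {abs(a(n)) ** (1 / n):.4f}")
```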
Differentiability of Power Series. If $f(z) = \sum_{n=0}^{\infty} a_n z^n$ for $|z| < R$ (with $R > 0$), then $f$ is analytic on $B(0; R)$ and $f'(z) = \sum_{n=1}^{\infty} n a_n z^{n-1}$ for $|z| < R$.
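For instance, with $a_n = 1/n!$ the series is $e^z$ (with $R = \infty$), and differentiating term by term gives $\sum_{n=1}^\infty \frac{n z^{n-1}}{n!} = \sum_{m=0}^\infty \frac{z^m}{m!} = e^z$ again. A minimal numerical sketch (my own, with assumed truncation at 40 terms and step $h = 10^{-6}$) comparing the term-by-term derivative with $\exp$ and with a difference quotient:

```python
import cmath
import math

# f(z) = sum_{n>=0} z^n / n!  (= exp(z)); term-by-term derivative: sum_{n>=1} n z^(n-1) / n!.

def f(z: complex, terms: int = 40) -> complex:
    return sum(z ** n / math.factorial(n) for n in range(terms))

def f_prime_termwise(z: complex, terms: int = 40) -> complex:
    return sum(n * z ** (n - 1) / math.factorial(n) for n in range(1, terms))

z = 0.7 - 0.2j
h = 1e-6
difference_quotient = (f(z + h) - f(z)) / h

print(abs(f_prime_termwise(z) - cmath.exp(z)))         # ~ 1e-16: matches exp(z)
print(abs(f_prime_termwise(z) - difference_quotient))  # ~ 1e-7: matches the quotient
```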
Weierstrass M-test. Let $\{u_k(z)\}_{k=0}^\infty$ be a sequence of complex-valued functions defined on a set $\gamma^* \subseteq \mathbb{C}$. Suppose there exists a sequence of non-negative real numbers $\{M_k\}_{k=0}^\infty$ such that $|u_k(z)| \le M_k$ for all $z \in \gamma^*$ and all $k$ (the bounding condition), and $\sum_{k=0}^\infty M_k$ converges (convergence of the bound).
Then the series $\sum_{k=0}^\infty u_k(z)$ converges uniformly on $\gamma^*$.
Proof.
Define Partial Sums. Let $S_n(z) = \sum_{k=0}^n u_k(z)$ be the $n$-th partial sum.
Apply the Cauchy Criterion. $\{S_n(z)\}$ converges uniformly iff for every $\epsilon > 0$, there exists $N \in \mathbb{N}$ such that for all $m > n \geq N$ and all $z \in \gamma^*, |S_m(z) - S_n(z)| < \epsilon$.
$$ \begin{aligned} |S_m(z) - S_n(z)| &= \left|\sum_{k=n+1}^m u_k(z)\right| && \text{(the difference of partial sums is a tail segment of the series)} \\[2pt] &\leq \sum_{k=n+1}^m |u_k(z)| && \text{(triangle inequality)} \\[2pt] &\leq \sum_{k=n+1}^m M_k && \text{(bounding condition).} \end{aligned} $$

Since $\sum_{k=0}^\infty M_k$ converges, its partial sums form a Cauchy sequence of real numbers, i.e., for our given $\epsilon > 0$ there exists $N$ such that for all $m > n \geq N$, $\sum_{k=n+1}^m M_k < \epsilon$.
Putting it all together: $|S_m(z) - S_n(z)| \leq \sum_{k=n+1}^m M_k < \epsilon$. This inequality holds for all $z \in \gamma^*$ simultaneously (the N depends only on $\epsilon$, not on z). Since the partial sums $\{S_n(z)\}$ are uniformly Cauchy, they converge uniformly to some function $S(z) = \sum_{k=0}^\infty u_k(z)$ on $\gamma^*$.
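As a concrete instance (my own illustration, not from the text): take $u_k(z) = z^k/k^2$ on the closed unit disk $|z| \le 1$ and $M_k = 1/k^2$. Both conditions hold, since $|z^k/k^2| \le 1/k^2$ there and $\sum 1/k^2$ converges, so the series converges uniformly on the disk. The sketch below samples points of the disk and checks that the tail of the series is dominated by the tail of $\sum M_k$ (assumed truncation of the tail at $k = 1000$).

```python
import cmath
import math

# Weierstrass M-test illustration: u_k(z) = z^k / k^2 on |z| <= 1, with M_k = 1 / k^2.
# Over sampled points of the disk, sup |sum_{k>N} u_k(z)| should be <= sum_{k>N} M_k.

N, K = 50, 1000                        # tail starts after N; truncate the tail at K
samples = [r * cmath.exp(2j * math.pi * j / 32)
           for r in (0.5, 1.0) for j in range(32)]

tail_sup = max(abs(sum(z ** k / k ** 2 for k in range(N + 1, K))) for z in samples)
tail_of_M = sum(1 / k ** 2 for k in range(N + 1, K))

print(f"sup over samples of |series tail| = {tail_sup:.6f}")
print(f"tail of the bounding series       = {tail_of_M:.6f}")
assert tail_sup <= tail_of_M + 1e-12
```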
Lemma 1. Let $\gamma$ be a path and let $u_k(z)$ be a sequence of continuous functions on $\gamma^*$. Suppose we can find a set of positive constants $M_k$ such that $|u_k(z)| \le M_k$ for all $z \in \gamma^*$ and $\sum_{k=0}^\infty M_k$ converges.
Then $\sum_{k=0}^\infty u_k(z)$ converges uniformly on $\gamma^*$ to a continuous function $U(z)$, and we can legally swap the integral and the sum: $\sum_{k=0}^\infty \int_\gamma u_k(z)\,dz = \int_\gamma \sum_{k=0}^\infty u_k(z)\,dz = \int_\gamma U(z)\,dz$.
In calculus, we know that for a finite sum, the integral of the sum is the sum of the integrals (linearity of the integral): $\int_\gamma (u_1(z) + u_2(z)) dz = \int_\gamma u_1(z) dz + \int_\gamma u_2(z) dz$. However, this is not automatically true for an infinite sum. This lemma states that we can swap them, provided the series converges “nicely” enough (Weierstrass M-test).
Proof.
By the Weierstrass M-test (the bounding condition and the convergence of the bound are both satisfied), the series converges “nicely” enough, i.e., $\sum_{k=0}^\infty u_k(z)$ converges uniformly on $\gamma^*$ to a continuous function $U(z)$.
Why is the limit function also continuous? The Uniform Convergence and Continuity Theorem states that if a sequence of continuous functions $U_N$ converges uniformly to a limit function $U$, then the limit function $U$ must also be continuous.
Is our sequence made of continuous functions? Yes: each $u_k(z)$ is continuous on $\gamma^*$ by hypothesis, so each partial sum $U_N(z) = \sum_{k=0}^N u_k(z)$ is a finite sum of continuous functions and hence continuous. Therefore the uniform limit $U(z)$ is continuous on $\gamma^*$.
The strategy of the proof is to show that the error (the difference) between the full integral $\int U(z)dz$ and the $N$-th partial sum of the integrals $\sum_{k=0}^N \int u_k(z)dz$ must go to zero as $N$ gets large.
Let $U(z) = \sum_{k=0}^\infty u_k(z)$ be the final sum of the series, and let $U_N(z) = \sum_{k=0}^N u_k(z)$ be the $N$-th partial sum, $N = 0, 1, \cdots$. Our goal is to prove that $\int_\gamma U(z) dz = \sum_{k=0}^\infty \int_\gamma u_k(z) dz =[\text{Definition of a series}] \lim_{N \to \infty} \sum_{k=0}^N \int_\gamma u_k(z) dz$.
We want to show that the integral of the full sum is the limit of the sum of the integrals. Let’s look at the difference or “error” between the integral of the full sum and the N-th partial sum of the integrals:
$$ \begin{aligned} \text{Error}_N &= \left|\int_\gamma U(z)\,dz - \sum_{k=0}^N \int_\gamma u_k(z)\,dz\right| \\[2pt] &= \left|\int_\gamma U(z)\,dz - \int_\gamma \sum_{k=0}^N u_k(z)\,dz\right| && \text{(} U_N(z) \text{ is a finite sum, so linearity pulls the sum inside the integral)} \\[2pt] &= \left|\int_\gamma U(z)\,dz - \int_\gamma U_N(z)\,dz\right| \\[2pt] &= \left|\int_\gamma \bigl(U(z) - U_N(z)\bigr)\,dz\right| && \text{(linearity again: combine the two integrals)} \\[2pt] &\le \sup_{z \in \gamma^*} |U(z) - U_N(z)| \cdot \operatorname{length}(\gamma) && \text{(ML-inequality: } \left|\int_\gamma f(z)\,dz\right| \le M \cdot \operatorname{length}(\gamma) \text{)} \\[2pt] &= \sup_{z \in \gamma^*} \left|\sum_{k=N+1}^\infty u_k(z)\right| \cdot \operatorname{length}(\gamma) && \text{(} U - U_N \text{ is the tail of the series after the } N\text{-th term)} \\[2pt] &\le \sup_{z \in \gamma^*} \sum_{k=N+1}^\infty |u_k(z)| \cdot \operatorname{length}(\gamma) && \text{(triangle inequality for infinite series)} \\[2pt] &\le \sum_{k=N+1}^\infty M_k \cdot \operatorname{length}(\gamma) && \text{(bounding condition).} \end{aligned} $$

Here $\operatorname{length}(\gamma)$ is the path length of the contour and $M$ is the maximum value of $|f|$ on it. We have found a bound for $\text{Error}_N$ that is independent of $z$: $0 \le \text{Error}_N \le \operatorname{length}(\gamma) \cdot \left( \sum_{k=N+1}^\infty M_k \right)$.
We are given that $\sum_{k=0}^\infty M_k$ is a convergent series. A fundamental property of convergent series is that their tails must go to zero: $\lim_{N \to \infty} \left( \sum_{k=N+1}^\infty M_k \right) = 0$. As $N \to \infty$, the right-hand side of our inequality therefore goes to 0, while the left-hand side is the constant 0.
By the Squeeze Theorem, the error term in the middle must also go to zero: $\lim_{N \to \infty} \text{Error}_N = 0, \lim_{N \to \infty} \left| \int_\gamma U(z) dz - \sum_{k=0}^N \int_\gamma u_k(z) dz \right| = 0$
This is exactly the definition of the limit, which means: $\int_\gamma U(z)\,dz = \lim_{N \to \infty} \sum_{k=0}^N \int_\gamma u_k(z)\,dz =[\text{Definition of an infinite series}] \sum_{k=0}^\infty \int_\gamma u_k(z)\,dz$.
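As a concrete illustration of the swap (my own numerical sketch, with the assumed choices $u_k(z) = z^{k-1}$ and $\gamma = C_{1/2}$, the circle $|z| = 1/2$): here $U(z) = \sum_{k=0}^\infty z^{k-1} = \frac{1}{z(1-z)}$ for $0 < |z| < 1$, the constants $M_k = (1/2)^{k-1}$ satisfy the hypotheses on the circle, and both $\sum_k \int_\gamma u_k(z)\,dz$ and $\int_\gamma U(z)\,dz$ come out to $2\pi i$ (only the $k = 0$ term, $z^{-1}$, contributes).

```python
import cmath
import math

# Lemma 1 in action: u_k(z) = z^(k-1) on the circle |z| = 1/2, U(z) = 1 / (z (1 - z)).
# Both sides of the swap should equal 2*pi*i; only the k = 0 term (z^-1) contributes.

R, STEPS = 0.5, 4000
DT = 2 * math.pi / STEPS

def contour_integral(g) -> complex:
    """Approximate the integral of g over the positively oriented circle |z| = R."""
    total = 0
    for j in range(STEPS):
        z = R * cmath.exp(1j * j * DT)
        total += g(z) * (1j * z) * DT       # dz = i z dt on the circle
    return total

sum_of_integrals = sum(contour_integral(lambda z, k=k: z ** (k - 1)) for k in range(40))
integral_of_sum = contour_integral(lambda z: 1 / (z * (1 - z)))

print(sum_of_integrals)   # ~ 0 + 6.2832j
print(integral_of_sum)    # ~ 0 + 6.2832j
print(2j * math.pi)       #   6.283185...j
```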
Coefficients of power series. Let $f(z) = \sum_{k=0}^\infty c_k z^k$, where this power series has radius of convergence $R > 0$. Then the $n$-th coefficient $c_n$ can be extracted using the integral formula $c_n = \frac{1}{2\pi i} \int_{C_r} \frac{f(z)}{z^{n+1}}\,dz$, for $0 < r < R$ and $n \ge 0$, where $C_r$ is the circle of radius $r$ centered at 0 and oriented positively.
Proof.
We start by taking the integral on the right-hand side and substituting the definition of $f(z) = \sum_{k=0}^\infty c_kz^k$.
$\int_{C_r} \frac{f(z)}{z^{n+1}}dz = \int_{C_r} \frac{\sum_{k=0}^\infty c_kz^k}{z^{n+1}}dz =[\text{Now, we use algebra to bring the denominator } z^{n+1} \text{ inside the summation}] \int_{C_r} \left( \sum_{k=0}^\infty c_kz^{k-n-1} \right)dz$
We want to move the $\int$ symbol inside the $\sum$ symbol. We are only allowed to do this if the series converges uniformly on the path. Lemma 1 provides the checklist to ensure we are allowed to do this.
Recall Lemma 1: for a path $\gamma$ and a sequence of functions $u_k(z)$ continuous on $\gamma^*$, if there are positive constants $M_k$ with $|u_k(z)| \le M_k$ for all $z \in \gamma^*$ and $\sum_{k=0}^\infty M_k$ convergent, then $\sum_{k=0}^\infty \int_\gamma u_k(z)\,dz = \int_\gamma \sum_{k=0}^\infty u_k(z)\,dz$. Let us check its conditions for $u_k(z) = c_k z^{k-n-1}$ on $\gamma = C_r$.
If the exponent $k - n - 1$ is non-negative, the term is a polynomial (a monomial), which is continuous everywhere on the complex plane. If the exponent $k - n - 1$ is negative, the term is a rational function (like $1/z^2$); these are continuous everywhere except at the pole $z = 0$. Our path $\gamma$ is the circle $C_r$, defined by $|z| = r$ with $r > 0$. Crucially, this path does not pass through the origin $z = 0$, hence every term $u_k(z)$ is continuous on the trace of $\gamma$.
For the bounding constants, take $M_k := |c_k| r^{k-n-1}$: on $C_r$ we have $|c_k z^{k-n-1}| = |c_k| r^{k-n-1} = M_k$, and $\sum_{k=0}^\infty M_k = r^{-n-1} \sum_{k=0}^\infty |c_k| r^k$ converges because the power series converges absolutely at $|z| = r < R$. Because the conditions of Lemma 1 are met, we are legally allowed to swap the integral and sum.
$\int_{C_r} \frac{f(z)}{z^{n+1}}dz = \sum_{k=0}^\infty \int_{C_r} c_kz^{k-n-1}dz = \sum_{k=0}^\infty c_k \left( \int_{C_r} z^{k-n-1}dz \right).$
Let $C_r$ be the positively oriented circle centered at 0 with radius $r > 0$, parameterized by $z(t) = re^{it}$ for $t \in [0, 2\pi]$. Then for any integer $m \in \mathbb{Z}, \int_{C_r} z^m dz = \begin{cases} 2\pi i & \text{if } m = -1 \\ 0 & \text{if } m \neq -1 \end{cases}$
We write the circle as: $z(t) = re^{it}$ where $t$ goes from $0$ to $2\pi$. The differential is: $dz = ire^{it}dt$
$\int_{C_r} z^m dz = \int_0^{2\pi} (re^{it})^m \cdot ire^{it} dt = ir^{m+1} \int_0^{2\pi} e^{i(m+1)t} dt$
When $m = -1$, we have $m + 1 = 0$, so $e^{i(m+1)t} = e^0 = 1$. $\int_{C_r} z^{-1} dz = ir^{0} \int_0^{2\pi} 1 \, dt = i \cdot 1 \cdot [t]_0^{2\pi} = i \cdot 2\pi = 2\pi i$
When $m \neq -1$, we have $m + 1 \neq 0$ and can integrate: $\int_0^{2\pi} e^{i(m+1)t} dt = \left[\frac{e^{i(m+1)t}}{i(m+1)}\right]_0^{2\pi}$
$\forall m \in \mathbb{Z}, e^{i(m+1)2\pi} = \cos(2\pi(m+1)) + i\sin(2\pi(m+1)) = 1$
$\left[\frac{e^{i(m+1)t}}{i(m+1)}\right]_0^{2\pi} = \frac{1 - 1}{i(m+1)} = 0 \leadsto \int_{C_r} z^m dz = ir^{m+1} \cdot 0 = 0$
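A quick numerical sanity check of this computation (my own sketch, approximating the parameterized integral with a uniform Riemann sum over the assumed circle $r = 0.5$): the result should be $2\pi i \approx 6.2832\,i$ for $m = -1$ and essentially $0$ for every other integer $m$.

```python
import cmath
import math

def circle_integral(m: int, r: float = 0.5, steps: int = 20000) -> complex:
    """Approximate the integral of z^m over |z| = r (positively oriented)
    via z(t) = r e^{it}, dz = i r e^{it} dt, t in [0, 2*pi]."""
    dt = 2 * math.pi / steps
    total = 0
    for j in range(steps):
        z = r * cmath.exp(1j * j * dt)
        total += z ** m * (1j * z) * dt
    return total

for m in (-3, -2, -1, 0, 1, 2):
    val = circle_integral(m)
    print(f"m = {m:2d}:  integral ~= {val.real:+.6f} {val.imag:+.6f}i")
print(f"2*pi = {2 * math.pi:.6f}")
```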
Let’s look at our exponent: m = k - n - 1. We want to know when this exponent equals -1: $k - n - 1 = -1 \implies k = n$. The integral kills almost every term in the infinite sum. If $k \neq n$: The exponent is not -1, so the integral is 0. If k = n: The exponent is -1, so the integral is $2\pi i$.
$\int_{C_r} \frac{f(z)}{z^{n+1}}dz = \dots + c_{n-1}(0) + \mathbf{c_n(2\pi i)} + c_{n+1}(0) + \dots = 2\pi ic_n$, hence $c_n = \frac{1}{2\pi i} \int_{C_r} \frac{f(z)}{z^{n+1}}dz$.
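To close the loop, here is a numerical sketch (my own, with assumed parameters $r = 0.5$ and $2000$ sample points on the circle) verifying the coefficient formula for $f(z) = \frac{1}{1-z} = \sum_{k=0}^\infty z^k$, whose radius of convergence is $R = 1$ and whose coefficients are all $c_k = 1$:

```python
import cmath
import math

def coefficient(f, n: int, r: float = 0.5, steps: int = 2000) -> complex:
    """Approximate c_n = (1 / (2*pi*i)) * integral over C_r of f(z) / z^(n+1) dz,
    using the parameterization z(t) = r e^{it}, dz = i z dt."""
    dt = 2 * math.pi / steps
    total = 0
    for j in range(steps):
        z = r * cmath.exp(1j * j * dt)
        total += f(z) / z ** (n + 1) * (1j * z) * dt
    return total / (2j * math.pi)

def f(z: complex) -> complex:
    return 1 / (1 - z)               # = sum_{k>=0} z^k for |z| < 1, so every c_k is 1

for n in range(5):
    c = coefficient(f, n)
    print(f"c_{n} ~= {c.real:+.6f} {c.imag:+.6f}i")   # expect +1.000000 +0.000000i
```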