
Counting Zeros of Analytic Functions and Rouché's Theorem

Now I am become Death, the destroyer of worlds, Robert Oppenheimer


Introduction

Definition. Complex sequence. A sequence of complex numbers is a function $a: \mathbb{N} \to \mathbb{C}$. We usually denote it by $(a_n)_{n \in \mathbb{N}}$ or simply $(a_n)$, where $a_n := a(n)$. The value $a_1$ is called the first term of the sequence, $a_2$ the second term, and in general $a_n$ the n-th term of the sequence.

Definition. Convergent complex sequence. A complex sequence $(a_n)_{n \in \mathbb{N}}$ is said to converge to a complex number $L \in \mathbb{C}$ if for every $\varepsilon > 0$ there exists an integer $N \in \mathbb{N}$ such that for all $n \geq N$ one has $|a_n - L| < \varepsilon$. In this case we write $\lim_{n \to \infty} a_n = L$ or $a_n \to L$ as $n \to \infty$, and L is called the limit of the sequence $(a_n)_{n \in \mathbb{N}}$.

Definition. Cauchy sequence. A complex sequence $(a_n)_{n \in \mathbb{N}}$ is called a Cauchy sequence if for every $\varepsilon > 0$ there exists an integer $N \in \mathbb{N}$ such that for all $n, m \geq N$ one has $|a_n - a_m| < \varepsilon$.

Definition. Series and partial sums. Let $(a_n)_{n \in \mathbb{N}}$ be a complex sequence. For each $n \in \mathbb{N}$, the finite sum $s_n := a_1 + a_2 + \cdots + a_n = \sum_{k=1}^n a_k$ is called the n-th partial sum of the (infinite) series $\sum_{k=1}^\infty a_k$, which we also denote simply by $\sum a_n$ when the index is clear from the context.

Definition. Convergent series. The series $\sum_{n=1}^{\infty} a_n$ is said to converge to the sum $s \in \mathbb{C}$ if the sequence of partial sums $(s_n)_{n \in \mathbb{N}}$ defined by $s_n = a_1 + a_2 + \cdots + a_n = \sum_{k=1}^n a_k$ converges to s, that is, $\lim_{n \to \infty} s_n = s$. In this case we write $s := \sum_{n=1}^\infty a_n$. If the sequence $(s_n)_{n \in \mathbb{N}}$ does not converge, we say that the series $\sum_{n=1}^{\infty} a_n$ diverges (or does not converge).

Definition. A complex power series centered at 0 in the variable z is a series of the form $a_0 + a_1z + a_2z^2 + \cdots = \sum_{n=0}^\infty a_n z^n$ with coefficients $a_i \in \mathbb{C}$.

Definition. A complex power series centered at a complex number $a \in \mathbb{C} $ is an infinite series of the form: $\sum_{n=0}^\infty a_n (z - a)^n,$ where each $a_n \in \mathbb{C}$ is a coefficient, z is a complex variable, and $(z - a)^n$ is the nth power about the center.

Theorem. Given a power series $\sum_{n=0}^\infty a_n z^n$, there exists a unique value R, $0 \le R \le \infty$ (called the radius of convergence) such that:

  1. For any z with |z| < R (inside the circle), the series $\sum_{n=0}^\infty a_n z^n$ converges absolutely (this is a “green light” zone).
  2. For any z with |z| > R, the series diverges (this is a “red light” zone).

    On the Circle (|z| = R), this theorem gives no information. This is the yellow light zone —the series could converge or diverge.
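The radius of convergence can also be estimated numerically via the root test, $R = 1/\limsup_n |a_n|^{1/n}$. Here is a minimal sketch in plain Python (the helper name `radius_estimate` is ours, purely for illustration), using the series $\sum_{n=0}^\infty z^n/2^n$, whose radius is exactly 2:

```python
# Numerical sketch: estimate the radius of convergence of sum z^n / 2^n
# from the root test, R = 1 / limsup |a_n|^(1/n). Here a_n = 2**(-n).

def radius_estimate(coeff, n):
    """Estimate R from the n-th coefficient as 1 / |a_n|^(1/n)."""
    return 1.0 / (abs(coeff(n)) ** (1.0 / n))

a = lambda n: 2.0 ** (-n)      # coefficients of sum z^n / 2^n
R = radius_estimate(a, 200)    # large n gives a good estimate

print(R)  # ≈ 2.0, the exact radius of convergence
```

Inside $|z| < 2$ the terms shrink geometrically (green light); outside, they blow up (red light).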

Differentiability of Power Series. If $f(z) = \sum_{n=0}^{\infty} a_nz^n$ for |z| < R (R > 0), then f is analytic on B(0; R) and $f'(z) = \sum_{n=1}^{\infty} na_nz^{n-1}$ for |z| < R.

Weierstrass M-test. Let $\{u_k(z)\}_{k=0}^\infty$ be a sequence of complex-valued functions defined on a set $\gamma^* \subseteq \mathbb{C}$. If there exists a sequence of non-negative real numbers $\{M_k\}_{k=0}^\infty$ such that:

  1. Bounding Condition: $|u_k(z)| \leq M_k$ for all $z \in \gamma^*$ and all $k \in \mathbb{N}$
  2. Convergence of Bound: The series $\sum_{k=0}^\infty M_k$ converges

Then, the original series $\sum_{k=0}^\infty u_k(z)$ converges uniformly on $\gamma^*$.

Coefficients of power series. Let f(z) = $\sum_{k=0}^\infty c_kz^k$ where this power series has radius of convergence R > 0. Then, the n-th coefficient $c_n$ can be extracted using the integral formula $c_n = \frac{1}{2\pi i} \int_{C_r} \frac{f(z)}{z^{n+1}}dz, 0 \lt r \lt R, n \ge 0$, where $C_r$ is a circle of radius r centered at 0 and oriented positively.
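This extraction formula can be checked numerically. The sketch below (plain Python, rectangle rule on the parametrized circle; the helper name `coefficient` is ours) recovers $c_3 = 1$ for $f(z) = \frac{1}{1-z} = \sum_{k \ge 0} z^k$ (radius of convergence R = 1), integrating over a circle of radius r = 0.5:

```python
import cmath

def coefficient(f, n, r=0.5, steps=2000):
    """Approximate c_n = (1/2πi) ∮_{C_r} f(z)/z^(n+1) dz on |z| = r.
    With z = r e^{iθ} and dz = i z dθ, the integral reduces to the
    average of f(z)/z^n over the circle."""
    total = 0.0 + 0.0j
    for k in range(steps):
        theta = 2 * cmath.pi * k / steps
        z = r * cmath.exp(1j * theta)
        total += f(z) / z ** n
    return total / steps

f = lambda z: 1 / (1 - z)   # = sum_{k>=0} z^k, so c_n = 1 for every n
c3 = coefficient(f, 3)
print(abs(c3 - 1))  # ≈ 0: the contour integral recovers c_3 = 1
```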

Taylor’s Theorem. If f is analytic on an open disk B(a; R) (a disk of radius R centered at a), then f(z) can be represented exactly by a unique power series within that disk: $f(z) = \sum_{n=0}^{\infty}a_n (z - a)^n, \forall z \in B(a; R)$

This theorem bridges the gap between differentiability and power series. It guarantees that if a function behaves well (it is analytic) in a disk, it must also be infinitely differentiable and representable by a power series (an infinite polynomial) within that disk.

Furthermore, there exist unique constants $a_n = \frac{f^{(n)}(a)}{n!} = \frac{1}{2\pi i}\int_{C_r} \frac{f(w)}{(w-a)^{n+1}}dw$ where $C_r$ is a circle of radius r < R centered at a and oriented in the counterclockwise direction (positively oriented).
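The same numerical check works at a nonzero center. A short sketch (plain Python; `taylor_coefficient` is an illustrative helper name) for $f(z) = e^z$ centered at a = 1, where $a_2 = \frac{f''(1)}{2!} = \frac{e}{2}$:

```python
import cmath, math

def taylor_coefficient(f, a, n, r=1.0, steps=2000):
    """Approximate a_n = (1/2πi) ∮_{C_r} f(w)/(w-a)^(n+1) dw,
    where C_r is the positively oriented circle |w - a| = r.
    With w = a + r e^{iθ} and dw = i (w - a) dθ, the integral
    reduces to the average of f(w)/(w-a)^n over the circle."""
    total = 0.0 + 0.0j
    for k in range(steps):
        theta = 2 * cmath.pi * k / steps
        w = a + r * cmath.exp(1j * theta)
        total += f(w) / (w - a) ** n
    return total / steps

a2 = taylor_coefficient(cmath.exp, 1.0, 2)
print(abs(a2 - math.e / 2))  # ≈ 0: matches f''(1)/2! = e/2
```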

Counting the Zeroes of Analytic Functions (The Logarithmic Derivative)

We are going to show how we can use an integral to “count” the order (multiplicity) of a zero. Suppose f is analytic on a disk B(a; r) and has a zero at a of order m. We assume f is not identically zero, so a is an isolated zero.

Using the Structural Definition of a zero, we can write: $f(z) = (z-a)^mh(z), \forall z \in B(a; r)$ where h is analytic (and therefore continuous) and $h(a) \ne 0$. Because $h(a) \neq 0$ and h is continuous, there is a small neighborhood $B(a; \varepsilon)$ where h(z) is never zero.

f’(z) = [Use the Product Rule on $f(z) = (z-a)^m h(z)$] $m(z-a)^{m-1}h(z) + (z-a)^mh'(z), \forall z \in B(a; r)$

We want to analyze the fraction $\frac{f'(z)}{f(z)}$. This is often called the Logarithmic Derivative.

For $z \ne a, z \in B(a; \varepsilon), \frac{f'(z)}{f(z)} = \frac{m}{z-a}+\frac{h'(z)}{h(z)}$

Now, we integrate this ratio around a small circle $C_{\varepsilon_0}$ centered at a with radius $\varepsilon_0 < \varepsilon$, oriented counterclockwise.

$\int_{C_{\varepsilon_0}}\frac{f'(z)}{f(z)} dz = \int_{C_{\varepsilon_0}}\frac{m}{z-a}dz + \int_{C_{\varepsilon_0}} \frac{h'(z)}{h(z)} dz$

$\int_{C_{\varepsilon_0}} \frac{h'(z)}{h(z)} dz$ vanishes because $\frac{h'(z)}{h(z)}$ is an analytic function (h is analytic, $h(z) \neq 0$ in this small circle, therefore the quotient is analytic inside and on the circle) on the contour $C_{\varepsilon_0}$. By Cauchy’s Theorem (integral of an analytic function on a closed loop), this integral is zero.

$\int_{C_{\varepsilon_0}}\frac{f'(z)}{f(z)} dz = m \cdot (2\pi i) \leadsto \frac{1}{2\pi i} \int_{C_{\varepsilon_0}}\frac{f'(z)}{f(z)} dz = m$

Recall the Cauchy integral formula: $f(z_{0})=\frac{1}{2\pi i}\oint_{C}\frac{f(z)}{z-z_{0}}dz$; in particular, taking $f \equiv 1$, $\oint_{C} \frac{1}{z-a} dz = 2\pi i$ whenever a lies inside C.

$m = \frac{1}{2\pi i} \oint_{C_{\varepsilon_0}} \frac{f'(z)}{f(z)} dz$ tells us that integrating $\frac{f'}{f}$ around a zero effectively “scans” the point and “detects” or “counts” the multiplicity; so this formula returns an integer equal to the order of that zero.
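A quick numerical sanity check of this "zero detector" (a plain-Python sketch; the function names are ours): take $f(z) = (z-1)^3 e^z$, which has a zero of order m = 3 at a = 1, and integrate $f'/f$ over a small circle around it.

```python
import cmath

def count_zero_order(f, fprime, a, radius=0.5, steps=2000):
    """Approximate (1/2πi) ∮ f'(z)/f(z) dz over the circle |z - a| = radius."""
    total = 0.0 + 0.0j
    for k in range(steps):
        theta = 2 * cmath.pi * k / steps
        z = a + radius * cmath.exp(1j * theta)
        dz = 1j * radius * cmath.exp(1j * theta) * (2 * cmath.pi / steps)
        total += fprime(z) / f(z) * dz
    return total / (2j * cmath.pi)

f  = lambda z: (z - 1) ** 3 * cmath.exp(z)                 # zero of order 3 at z = 1
fp = lambda z: (3 * (z - 1) ** 2 + (z - 1) ** 3) * cmath.exp(z)  # product rule

m = count_zero_order(f, fp, 1.0)
print(round(m.real))  # -> 3, the order of the zero
```

The integral "scans" the point and returns the multiplicity, exactly as the formula predicts.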

Theorem. The Argument Principle (counting zeroes). Let f be analytic inside and on a positively oriented closed contour γ, and assume f is never zero on the boundary $\gamma$. Let N be the total number of zeroes inside $\gamma$, counted with their multiplicities (orders). Then, $\frac{1}{2\pi i} \oint_{\gamma} \frac{f'(z)}{f(z)} dz = N$.

Proof.

The function f’/f is analytic inside and on γ except at the zeroes of f lying inside γ. We cannot use Cauchy’s Theorem immediately because these points $a_k$ are “holes” (singularities) in the domain.

Suppose the zeroes are $a_1, a_2, \cdots, a_n$ of orders $m_1, m_2, \cdots, m_n$ respectively inside γ, $N = m_1 + m_2 + \dots + m_n$.

We can find disjoint open disks $B(a_k; r_k), k = 1, 2, \cdots, n$, such that on each disk there is a non-zero analytic function $h_k$ with $f(z) = (z-a_k)^{m_k}h_k(z)$.

We repeat the earlier local argument at each zero. Using the Structural Definition of a zero, we can write $f(z) = (z-a_k)^{m_k}h_k(z)$ for all $z \in B(a_k; r_k)$, where $h_k$ is analytic (and therefore continuous) and $h_k(a_k) \ne 0$. Because $h_k(a_k) \neq 0$ and $h_k$ is continuous, we may shrink $r_k$ so that $h_k(z)$ is never zero on the disk.

Again we analyze the fraction $\frac{f'(z)}{f(z)}$, the Logarithmic Derivative.

For $z \ne a_k, z \in B(a_k; r_k), \frac{f'(z)}{f(z)} = \underbrace{\frac{m_k}{z-a_k}}_{\text{The Singular Part}} + \underbrace{\frac{h_k'(z)}{h_k(z)}}_{\text{The Analytic Part}}$

This equation tells us that the “bad behavior” at $a_k$ comes entirely from the term $\frac{m_k}{z-a_k}$. We want to create a new function $F(z)$ that is clean (analytic everywhere inside $\gamma$). To do this, we take our original function and subtract or remove all the singular parts (poles) for every zero.

$F(z) = \frac{f'(z)}{f(z)} - \sum_{j=1}^{n} \frac{m_j}{z-a_j}$

We need to check that $F$ is analytic everywhere inside $\gamma$.

  1. If z is not near any zero, that is, it is far from all $a_j$: $z \notin \cup_{k=1}^{n} B(a_k,r_k)$. Then $\frac{f'}{f}$ is analytic, each $\frac{1}{z-a_j}$ is analytic (no division by zero), and a sum of analytic functions is analytic.
  2. If $z$ is inside the disk of a specific zero $a_k, z \in B(a_k, r_k), 1 \le k \le n$. This is where the magic happens. $F(z) = \frac{f'(z)}{f(z)} - \sum_{j=1}^{n} \frac{m_j}{z-a_j} = \frac{m_k}{z-a_k} + \frac{h_k'(z)}{h_k(z)} - \underbrace{\frac{m_k}{z-a_k}}_{\text{Term } j=k} - \sum_{j \neq k} \frac{m_j}{z-a_j}$

The singular terms $\frac{m_k}{z-a_k}$ cancel each other out! $F(z) = \frac{h_k'(z)}{h_k(z)} - \sum_{j \neq k} \frac{m_j}{z-a_j}$. The first part $\frac{h'}{h}$ is analytic (from the previous argument). The sum part is analytic because we are near $a_k$, so we are far from every $a_j, j \neq k$ (the disks have been chosen to be disjoint), and there is no division by zero. Therefore, F(z) is analytic everywhere inside $\gamma$: no holes are left.

Since F(z) is analytic everywhere inside and on $\gamma$, Cauchy’s Theorem applies: $\int_\gamma F(z)dz = 0$. Substitute the definition of $F$ back in this formula: $\oint_{\gamma} \left( \frac{f'(z)}{f(z)} - \sum_{j=1}^{n} \frac{m_j}{z-a_j} \right) dz = 0$

Next, we move the sum to the other side (linearity of integrals): $\oint_{\gamma} \frac{f'(z)}{f(z)} dz = \sum_{j=1}^{n} \oint_{\gamma} \frac{m_j}{z-a_j} dz$.

We know the standard integral $\oint_{\gamma} \frac{1}{z-a} dz = 2\pi i$ (as long as the point a is inside the contour $\gamma$), hence $\oint_{\gamma} \frac{f'(z)}{f(z)} dz = \sum_{j=1}^{n} m_j (2\pi i) = 2\pi i \left( \sum_{j=1}^{n} m_j \right) = 2\pi i N \text{ where } \sum_{j=1}^{n} m_j = N$.

Dividing by $2\pi i$ gives $\frac{1}{2\pi i} \oint_{\gamma} \frac{f'(z)}{f(z)} dz = N$. Q.E.D.
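The theorem can be observed numerically (a sketch in plain Python; `count_zeros` is an illustrative helper name): $f(z) = z^2(z - \frac{1}{2})$ has a double zero at 0 and a simple zero at $\frac{1}{2}$, so N = 3 inside the unit circle.

```python
import cmath

def count_zeros(f, fprime, center=0.0, radius=1.0, steps=4000):
    """Approximate N = (1/2πi) ∮_γ f'/f dz over the circle |z - center| = radius."""
    total = 0.0 + 0.0j
    for k in range(steps):
        theta = 2 * cmath.pi * k / steps
        z = center + radius * cmath.exp(1j * theta)
        dz = 1j * radius * cmath.exp(1j * theta) * (2 * cmath.pi / steps)
        total += fprime(z) / f(z) * dz
    return total / (2j * cmath.pi)

f  = lambda z: z ** 2 * (z - 0.5)   # zeroes: 0 (order 2) and 0.5 (order 1)
fp = lambda z: 3 * z ** 2 - z       # derivative of z^3 - 0.5 z^2

N = count_zeros(f, fp)
print(round(N.real))  # -> 3 = 2 + 1, zeroes counted with multiplicity
```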

Rouché’s Theorem. Let f and g be analytic inside and on a positively oriented closed contour γ, and suppose that the strict inequality $|f(z)| > |g(z)|$ holds on the boundary, $\forall z \in \gamma^*$. Then, f and f + g have the same number of zeroes, counting multiplicities, inside γ.

Intuitive Idea. Imagine you are an observer standing at the origin (a tree or a “flagpole”). A person walks a dog around this tree: the person walks along a path f(z), and the dog is attached to the person by a leash represented by the vector g(z), so the dog’s position is the person’s position plus the leash vector, f(z) + g(z). If the length of the leash |g(z)| is always shorter than the person’s distance to the tree $|f(z)|$, then the dog must circle the tree the exact same number of times as the person: the leash is too short for the dog to get to the other side of the tree and unwind it.

Proof.

Let’s introduce a parameter t that ranges from 0 to 1, t ∈ [0, 1]. Consider the family of functions: $\phi_t(z) = f(z) + t \cdot g(z)$.

At t = 0, we have f(z). At t = 1, we have f(z) + g(z). We need to count the zeroes of $\phi_t(z)$ for each t. Let’s define a counting function h(t) using the Argument Principle:

$h(t) = \frac{1}{2\pi i}\oint_{\gamma} \frac{\phi_t'(z)}{\phi_t(z)}dz = \frac{1}{2\pi i}\oint_{\gamma} \frac{f'(z) + t g'(z)}{f(z) + t g(z)}dz$.

h(t) counts the number of zeroes of $\phi_t(z)$ = (f + tg)(z) inside γ. h(0) is the number of zeroes of f inside γ, h(1) is the number of zeroes of f + g inside γ.

To ensure h(t) is well-defined, the denominator $\phi_t(z)$ must never be zero on the contour $\gamma$. Check the magnitude on the boundary:

$$ \begin{aligned} \forall z \in \gamma^*, \quad |\phi_t(z)| &= |f(z) + t g(z)| \\[2pt] &\ge |f(z)| - |t|\,|g(z)| \quad \text{(Reverse Triangle Inequality: } |A+B| \ge |A| - |B|\text{)} \\[2pt] &\ge |f(z)| - |g(z)| \quad \text{(since } t \in [0, 1] \text{ implies } |t| \le 1\text{)} \\[2pt] &> 0 \quad \text{(since } |f| > |g| \text{ on } \gamma^*\text{).} \end{aligned} $$

The function $\phi_t(z)$ is never zero on the boundary. The integral exists for all $t \in [0, 1]$.

As we have previously stated, h(t) counts the number of zeroes of $\phi_t(z) = (f + tg)(z)$ inside γ, so it must output a non-negative integer (0, 1, 2, 3, …). If we can prove that h(t) is a continuous function, we are done.

Why? Because the only way an integer-valued function can be continuous is if it is constant. A continuous function must satisfy the Intermediate Value Theorem: if f is continuous on an interval and takes values a and b, then it must take every value between a and b. An integer-valued function only takes values in $\mathbb{Z}$, which are discrete — there are no integers between 2 and 3. In other words, it cannot “jump” from 2 to 3 without passing through 2.5.

Let’s calculate the difference between the count at time t and time s:

$$ \begin{aligned} h(t) - h(s) &= \frac{1}{2\pi i} \int_{\gamma} \left( \frac{f' + tg'}{f + tg} - \frac{f' + sg'}{f + sg} \right) dz \\[2pt] &\text{Combine the fractions (common denominator): Numerator} = (f' + tg')(f + sg) - (f' + sg')(f + tg) = \\[2pt] &f'f + s f' g + t g' f + ts g' g - f'f - t f' g - s g' f - st g' g = (t-s)(g'f - f'g) \\[2pt] &=\frac{t-s}{2\pi i} \int_{\gamma} \frac{g'f - f'g}{(f + tg)(f + sg)} dz. \end{aligned} $$

Bounding the Estimate. We want to show this difference goes to zero as $s \to t$.

  1. Bounds for the Top: Since $f, g, f', g'$ are continuous on the compact contour $\gamma$, the term |g’f - f’g| is bounded by some constant M.
  2. Bounds for the Bottom: Previously we have shown that |f(z) + t g(z)| > 0, then $|f+tg| \ge m$ for some m > 0. Besides, $|g| \le M'$. If we choose s close enough to t (specifically $|s-t| \le \frac{m}{2M'}$), then by triangle inequality: $|f+sg| = |f+tg+(s-t)g| \ge |f+tg| - |s-t||g| \ge m - \frac{m}{2M'}\cdot M' = m - \frac{m}{2} = \frac{m}{2}$

Extreme Value Theorem. If $f: K \rightarrow \mathbb{R}$ is continuous and $K\subseteq \mathbb{R^{\mathnormal{n}}}$ is compact (i.e. closed and bounded), then f attains both a maximum and a minimum value on K.

Therefore (Top $\le M$, Bottom $\ge m \cdot \frac{m}{2}$), $|h(t) - h(s)| \le \frac{|t-s|}{2\pi} \cdot \frac{M}{m \cdot (m/2)} \cdot \text{Length}(\gamma) = C \cdot |t-s|$. As $s \to t$, the right side goes to 0 (Squeeze Theorem, $0 \le |h(t) - h(s)| \le C \cdot |t-s|$). Therefore, $h(t)$ is continuous.

In conclusion, (1) h(t) is a function from [0, 1] to the integers $\mathbb{Z}$ (it counts the zeroes of $\phi_t(z) = (f + tg)(z)$ inside γ); (2) h(t) is continuous; (3) a continuous integer-valued function must be constant (Intermediate Value Theorem); hence (4) h(0) = h(1), where h(0) counts the number of zeroes of f and h(1) counts the number of zeroes of f + g. Therefore, f and f + g have the same number of zeroes. Q.E.D.
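The heart of the proof, that the integer-valued count h(t) never jumps, can be seen numerically. Here is a sketch (plain Python) with $f(z) = z^2$ and $g(z) = \frac{z}{2}$ on the unit circle, so that $|f| = 1 > \frac{1}{2} = |g|$ on the boundary: $\phi_t(z) = z^2 + \frac{t}{2}z$ always has two zeroes inside, z = 0 and z = -t/2 (a double zero at t = 0).

```python
import cmath

def h(t, steps=4000):
    """h(t) = (1/2πi) ∮_{|z|=1} φ_t'(z)/φ_t(z) dz for φ_t(z) = z² + (t/2) z."""
    total = 0.0 + 0.0j
    for k in range(steps):
        theta = 2 * cmath.pi * k / steps
        z = cmath.exp(1j * theta)
        dz = 1j * z * (2 * cmath.pi / steps)
        phi  = z ** 2 + (t / 2) * z    # φ_t(z)
        dphi = 2 * z + t / 2           # φ_t'(z)
        total += dphi / phi * dz
    return round((total / (2j * cmath.pi)).real)

counts = [h(t / 4) for t in range(5)]   # t = 0, 0.25, 0.5, 0.75, 1
print(counts)  # -> [2, 2, 2, 2, 2]: the count never jumps
```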

Examples

How many zeroes does $P(z) = z^5 + 3z + 1$ have inside the unit disk? Inside the unit circle (|z| < 1), powers like $z^5$ are small. Define f(z) = 3z and $g(z) = z^5 + 1$.

On the circle |z| = 1: $|f(z)| = |3z| = 3 > |g(z)|$ because $|g(z)| = |z^5 + 1| \le |z|^5 + 1 = 1 + 1 = 2$.

By Rouché’s Theorem, f + g = P(z) has the same number of zeroes as f(z) = 3z, which has exactly 1 zero (z = 0, the origin). Therefore, $z^5 + 3z + 1$ has exactly 1 zero inside the unit disk |z| < 1.
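We can double-check this count with the Argument Principle (a plain-Python sketch; `winding_count` is an illustrative name): integrate $P'/P$ over the unit circle.

```python
import cmath

def winding_count(f, fprime, steps=4000):
    """(1/2πi) ∮_{|z|=1} f'/f dz: the number of zeroes inside the unit circle."""
    total = 0.0 + 0.0j
    for k in range(steps):
        theta = 2 * cmath.pi * k / steps
        z = cmath.exp(1j * theta)
        dz = 1j * z * (2 * cmath.pi / steps)
        total += fprime(z) / f(z) * dz
    return round((total / (2j * cmath.pi)).real)

P  = lambda z: z ** 5 + 3 * z + 1
Pp = lambda z: 5 * z ** 4 + 3    # derivative of P

print(winding_count(P, Pp))  # -> 1, matching Rouché's prediction
```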

Let’s define the functions $f(z) = 2 + z^2, g(z) = -e^{iz}$, and set H(z) = f(z) + g(z). 💡Polynomials (like $z^2$) grow very fast as z gets large, so we expect $z^2$ to dominate everything else.

Our contour $\gamma$ has two parts: the segment $[-R, R]$ on the real axis, and the upper semicircle with center 0 and radius R ($C_R$, $z = Re^{i\theta}, 0 \le \theta \le \pi$), with the whole contour positively oriented.

  1. On the segment [-R, R] (the x-axis: z = x with $-R \le x \le R$, so $x^2 \ge 0$), $|f(z)| = |2 + x^2| \ge 2 \gt 1 = |g(z)|$

    $z = x, x^2 \ge 0, |2 + x^2| \ge 2.$ Besides, $|g(z)| = |-e^{ix}| = |e^{ix}|= \sqrt{\cos^2 x + \sin^2 x} = 1$

  2. On the semicircular arc, |z| = R, $z = Re^{i\theta} = R\cos(\theta)+iR\sin(\theta)$, where z = x + iy with $y = R\sin(\theta) \ge 0$ (upper half plane), so $|g(z)| = |-e^{i(R\cos(\theta) + iR\sin(\theta))}| = |-e^{-R\sin(\theta)+iR\cos(\theta)}| = e^{-R\sin(\theta)} \le e^0 = 1$. $|f(z)| \ge[\text{Reverse Triangle Inequality: } |A+B| \ge |A| - |B|] |z^2| - |2| = R^2 - 2 \gt 1$ as long as we pick a radius R big enough, more precisely $R^2 - 2 \gt 1 \iff R \gt \sqrt{3} \approx 1.73$

Since $|f(z)| > |g(z)|$ on the entire boundary (segment + arc), Rouché’s Theorem tells us that the number of zeroes of f(z) + g(z) = H(z) inside the contour equals the number of zeroes of f(z) inside the contour.

Only the root $z_1 = i\sqrt{2}$ lies in the upper half plane. Its magnitude is $|z_1| = \sqrt{2} \approx 1.414$. Since we chose our radius $R > \sqrt{3} \approx 1.73$, we have $|z_1| < R$, which places the root strictly inside our contour. Therefore, for any sufficiently large R, H(z) has exactly one zero inside the contour. Taking the limit as $R \to \infty$, we conclude that $H(z)$ has exactly one zero in the entire Upper Half Plane.

(Indeed, the zeroes of f: $z^2 + 2 = 0 \leadsto z^2 = -2 \leadsto z = \pm \sqrt{-2} = \pm i\sqrt{2}$.)
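Numerically, the count can be verified by integrating $H'/H$ over the closed semicircular contour, here with the concrete choice $R = 3 > \sqrt{3}$ (a plain-Python sketch using the midpoint rule on each piece; `contour_count` is our name):

```python
import cmath

H  = lambda z: 2 + z ** 2 - cmath.exp(1j * z)     # H(z) = f(z) + g(z)
Hp = lambda z: 2 * z - 1j * cmath.exp(1j * z)     # H'(z)

def contour_count(R=3.0, steps=4000):
    """(1/2πi) ∮_γ H'/H dz over the segment [-R, R] plus the upper arc."""
    total = 0.0 + 0.0j
    # Segment: z = t, t from -R to R, dz = dt (midpoint rule)
    for k in range(steps):
        t = -R + (2 * R) * (k + 0.5) / steps
        total += Hp(t) / H(t) * (2 * R / steps)
    # Arc: z = R e^{iθ}, θ from 0 to π, dz = iR e^{iθ} dθ = i z dθ
    for k in range(steps):
        theta = cmath.pi * (k + 0.5) / steps
        z = R * cmath.exp(1j * theta)
        total += Hp(z) / H(z) * 1j * z * (cmath.pi / steps)
    return round((total / (2j * cmath.pi)).real)

print(contour_count())  # -> 1: exactly one zero of H inside the contour
```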


JustToThePoint Copyright © 2011 - 2025 Anawim. ALL RIGHTS RESERVED.
