
Analytic Functions in Complex Analysis

As long as algebra is taught in school, there will be prayer in school, Cokie Roberts

Anyone attempting to generate random numbers by deterministic means is, of course, living in a state of sin, John Von Neumann


Analytic functions

Definition. A complex function f is said to be analytic (or holomorphic) at a point $z_0$ if it satisfies any of the following equivalent conditions:

  1. Differentiable in a neighborhood. There exists an open neighborhood U of $z_0$ such that f is complex differentiable at every point in U. Complex differentiability means that the derivative $f'(z)$ exists for all $z \in U$ in the complex sense: $f'(z) = \lim_{h \to 0} \frac{f(z+h)-f(z)}{h}.$

    Here the limit must be the same regardless of the direction from which h approaches zero in the complex plane. This is the crucial distinction from real differentiability: the increment $h$ is a complex number that can approach zero along any path — horizontal, vertical, spiral, or otherwise.

  2. Open disc criterion. There exists some radius $r > 0$ such that f is complex differentiable at every point of the open disc $B(z_0; r) = \{z : |z - z_0| < r\}$. This is a special case of Condition 1, since an open disc is simply a particular type of open neighborhood of $z_0$.
  3. Power series expansion. The function can be locally expressed as a convergent power series: $f(z) = \sum_{n = 0}^\infty a_n(z - z_0)^n$ in some neighborhood of $z_0$. This series converges absolutely and uniformly on compact subsets within its disk of convergence.
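Condition 1 can be illustrated numerically. The sketch below (illustrative only: the function $f(z) = z^2$, the sample point, and the step size are arbitrary choices) checks that the difference quotient approaches the same value when $h \to 0$ along several different directions in the complex plane:

```python
# For an analytic function such as f(z) = z**2, the difference quotient
# (f(z+h) - f(z)) / h approaches the same limit f'(z0) = 2*z0 regardless
# of the direction from which h approaches 0.
def diff_quotient(f, z, h):
    return (f(z + h) - f(z)) / h

f = lambda z: z**2
z0 = 1 + 2j
t = 1e-6
# Unit directions: horizontal, vertical, oblique.
directions = [1, 1j, 0.6 + 0.8j]
quotients = [diff_quotient(f, z0, t * d) for d in directions]
for q in quotients:
    print(q)  # all three approximate f'(z0) = 2 + 4j
```

For a non-analytic function such as $\bar z$ or $|z|^2$, the same experiment produces direction-dependent values.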

Recall.

While it is a straightforward consequence that a function defined by a convergent power series is complex differentiable (one can differentiate term by term), the converse is a profound and remarkable result. This equivalence underscores the extraordinary strength of complex differentiability.

The converse means that every function that is complex differentiable in an open set automatically admits a power series expansion about each point of that set. It automatically guarantees that the function is infinitely differentiable and can be locally expressed as a power series.

A function is defined as holomorphic (analytic) on an open set $U \subseteq \mathbb{C}$ if it is complex differentiable at every point in U.

The concept of analyticity is a cornerstone of complex analysis, representing a significant strengthening of the notion of differentiability. While a function of a real variable can be differentiable at a single point without being differentiable in any neighborhood of that point, a function of a complex variable that is analytic at a point must be differentiable in an open set containing that point (it requires differentiability in an entire neighborhood). This is what gives analytic functions their remarkable properties.

Definition. A function f which is analytic (holomorphic) on the entire complex plane $\mathbb{C}$ is called an entire function.

Properties

Analytic (holomorphic) functions enjoy many powerful properties and obey strict rules. Below are some of the most important properties, assuming f and g are analytic in some domain (open connected set) D:

A rational function $r(z) = \frac{p(z)}{q(z)}$ (with $p$ and $q$ polynomials) is holomorphic on $\mathbb{C} \setminus \{z : q(z) = 0\}$.

The points where $q(z) = 0$ (and $p(z) \neq 0$, i.e., the fraction is in lowest terms) are called poles of $r$. For example, $f(z) = \frac{1}{z^2 + 1}$ is analytic for all $z$ except at $z = i$ and $z = -i$ (the zeros of the denominator), which are poles of order one (simple poles).
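A minimal numerical sketch of the blow-up near the simple pole $z = i$ (the sample offsets are illustrative): near a simple pole, $|r(z)|$ grows like a constant times $1/|z - i|$, since $(i + \varepsilon)^2 + 1 = 2i\varepsilon + \varepsilon^2 \approx 2i\varepsilon$.

```python
# r(z) = 1/(z**2 + 1) has simple poles at z = ±i (zeros of the denominator).
# Approaching z = i, the modulus grows roughly like 1/(2*eps).
r = lambda z: 1 / (z**2 + 1)
for eps in (1e-1, 1e-2, 1e-3):
    print(eps, abs(r(1j + eps)))  # roughly 5, 50, 500
```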

Formulation 1. Zeros of an Analytic Function. Let $D \subset \mathbb{C}$ be a domain (an open connected set), and $f : D \to \mathbb{C}$ be an analytic function. If there exists an infinite sequence $\{z_k\} \subset D$, such that:

  1. $f(z_k) = 0$ for all $k \in \mathbb{N}$,
  2. $z_k \to z_0 \in D$ with $z_k \neq z_0$ (i.e., the sequence converges to a point of D, which is then an accumulation point of the zero set). Then, $f(z) = 0$ for all $z \in D$. In other words, if an analytic function vanishes on a set with an accumulation point in $D$, it must be identically zero on all of $D$.

Formulation 2. Equality of Two Analytic Functions. Let f(z) and g(z) be analytic in a domain D. If the set $\{z \in D : f(z) = g(z)\}$ has a limit point in D (i.e., there exists a convergent sequence $z_k \to z_0 \in D$ with $f(z_k) = g(z_k)$), then f(z) = g(z) for all $z \in D$. Two analytic functions that agree on a set with a limit point must be identical throughout D.

Example: Suppose f is analytic in ℂ (entire) and $f(\frac{1}{n}) = 0$ for all $n \in \mathbb{N}$. Since $\frac{1}{n} \to 0$ and $0 \in \mathbb{C}$, the Identity Theorem implies f(z) = 0 everywhere.
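By contrast, the requirement that the accumulation point lie inside the domain is essential. A standard cautionary example, written out:

```latex
% sin(pi z) is entire and vanishes at every integer, yet it is not the zero
% function: its zero set Z = {0, ±1, ±2, ...} has no accumulation point in C,
% so the Identity Theorem does not apply.
f(z) = \sin(\pi z), \qquad f(n) = 0 \ \text{for all } n \in \mathbb{Z},
\qquad \text{yet } f \not\equiv 0 .
```

The zeros $1/n$ in the example above accumulate at $0 \in \mathbb{C}$; the integers accumulate nowhere in $\mathbb{C}$, and that makes all the difference.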

Proof (Formulation 1): Since f is analytic at $z_0$, it has a power series expansion in some neighborhood $B_r(z_0) \subset D$: $f(z) = \sum_{n = 0}^∞ a_n(z - z_0)^n$.

Our goal is to show that all coefficients $a_n = 0$, forcing $f \equiv 0$ near $z_0$.

Since $f(z_k) = 0$ and $z_k \to z_0$, continuity of $f$ gives $f(z_0) = \lim_{k \to \infty} f(z_k) = 0$. But $f(z_0) = a_0$, so $a_0 = 0$. This is the “base case” of the induction.

Inductive argument for higher coefficients. Assume $a_0 = a_1 = \dots = a_{m-1} = 0$. Then, the power series becomes: $f(z) = \sum_{n = m}^\infty a_n(z - z_0)^n = (z-z_0)^m\sum_{n = 0}^\infty a_{n+m}(z - z_0)^n$

Define $g(z) = \sum_{n = 0}^\infty a_{n+m}(z - z_0)^n$, which is analytic in $B_r(z_0)$.

Now evaluate at the zeros $z_k$: Since $f(z_k) = 0$ and $z_k \neq z_0$ for sufficiently large $k$ (because $z_k \to z_0$ but the $z_k$ are distinct from $z_0$), we can factor: $0 = f(z_k) = (z_k - z_0)^m g(z_k)$. Since $(z_k - z_0)^m \neq 0$, we conclude $g(z_k) = 0$ for all sufficiently large $k$. In other words, the zeros of $f$ at the $z_k$ propagate to zeros of $g$.

This step relies on the assumption that the $z_k$ are distinct from $z_0$: if some $z_k$ equaled $z_0$, the factor $(z_k - z_0)^m$ would vanish and nothing could be concluded about $g(z_k)$.

Taking $k \to \infty$, continuity of $g$ gives $g(z_0) = a_m = 0$. By induction, $a_n = 0$ for all $n$. Since every coefficient $a_n=0$, the power series is identically zero in some disk around $z_0$, so $f \equiv 0$ in $B_r(z_0)$. This proves the function is zero locally.

Extend to the entire domain D. Define the set $A = \{z \in D : f \equiv 0 \text{ in some neighborhood of } z\}.$ A is open (by definition, if $f \equiv 0$ near one point $z$, then $f \equiv 0$ near nearby points), non-empty ($z_0 \in A$), and closed. If $w_k \in A$ and $w_k \to w \in D$, then $f(w_k) = 0$ for all $k$, so $w$ is an accumulation point of zeros of $f$ in $D$. Repeating the power series argument at $w$ shows $f \equiv 0$ near $w$, hence $w \in A$.

Since $D$ is connected, the only subset of $D$ that is simultaneously non-empty, open, and closed is $D$ itself. Therefore $A = D$, and $f \equiv 0$ everywhere in $D$. $\blacksquare$

Corollary (Equality of Two Analytic Functions) If f and g are analytic in D and agree on a set with a limit point in D, then h(z) = f(z) − g(z) is analytic and vanishes on that set. By the Identity Theorem (Formulation 1), $h \equiv 0$, so $f \equiv g$ on $D$. $\blacksquare$

Theorem (composition of analytic functions). Let f be analytic at $z_0$ and g be analytic at $w_0 = f(z_0)$. Then $h(z) = g(f(z))$ is analytic at $z_0$, with $h'(z_0) = g'(f(z_0))\,f'(z_0)$.

Proof:

Let $z_0 \in D$ and $w_0 = f(z_0)$. We must show that $h(z) = g(f(z))$ is differentiable at $z_0$ and compute its derivative. We consider two cases.

Case 1. $f(z) \neq f(z_0)$ for all $z$ in some punctured neighborhood of $z_0$.

In this scenario, for z sufficiently close to z₀ (but $z \neq z_0$), the term f(z) − f(z₀) is non-zero. This allows for a strategic manipulation of the difference quotient by multiplying and dividing by f(z)−f(z₀):

$ \frac{g(f(z)) - g(f(z_0))}{z - z_0} = \frac{g(f(z)) - g(f(z_0))}{f(z) - f(z_0)} \cdot \frac{f(z) - f(z_0)}{z - z_0}$

Now, we consider the limit as $z \to z_0$. The continuity of $f$ (which follows from its differentiability) ensures $f(z) \to f(z_0) = w_0$. Setting w = f(z), the limit of the first factor becomes $\frac{g(w) - g(w_0)}{w - w_0}$. This is precisely the definition of the derivative of g at $w_0$, which is $g'(w_0) = g'(f(z_0))$, since g is analytic at $w_0$. The second factor directly corresponds to the definition of the derivative of f at z₀, which is f′(z₀), since f is analytic at z₀.

Therefore, in this case, the limit exists and is given by the product: $\lim_{z \to z_0} \frac{g(f(z)) - g(f(z_0))}{f(z) - f(z_0)} \cdot \frac{f(z) - f(z_0)}{z - z_0} = g'(f(z_0))f'(z_0)$.

Case 2. $f(z) = f(z_0)$ for infinitely many $z$ accumulating at $z_0$.

This situation implies that z₀ is an accumulation point for the set of zeros of the function k(z) = f(z) - f(z₀). Since f is an analytic function, k(z) is also analytic (as the difference of two analytic functions). The Identity Theorem states that if an analytic function has an accumulation point of zeros within its domain, then the function must be identically zero throughout the connected component containing that accumulation point.

Consequently, $k \equiv 0$ in a neighborhood of $z_0$, meaning $f(z) = f(z_0)$ for all $z$ near $z_0$. In particular, $f$ is constant near (or in a neighborhood of) $z_0$, so its derivative must be zero, $f'(z_0) = 0$.

In this scenario, for $z \neq z_0$ within this neighborhood, $\frac{h(z) - h(z_0)}{z - z_0} = \frac{g(f(z)) - g(f(z_0))}{z - z_0} = \frac{g(f(z_0)) - g(f(z_0))}{z - z_0} = 0.$ Thus, the difference quotient is identically $0$. Taking the limit as $z \to z_0$, we find that $h'(z_0) = 0$. Crucially, the formula derived in Case 1, $g'(f(z_0))f'(z_0)$, also yields $g'(f(z_0)) \cdot 0 = 0$, demonstrating consistency across both cases.
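The chain-rule formula just proved can be sanity-checked numerically. The sample functions $f(z) = z^2 + 1$ and $g(w) = e^w$ below are illustrative choices, not part of the proof:

```python
import cmath

# Compare the difference quotient of h = g∘f at z0 with the chain-rule
# formula h'(z0) = g'(f(z0)) * f'(z0).
f, fp = lambda z: z**2 + 1, lambda z: 2*z    # f and its derivative
g, gp = cmath.exp, cmath.exp                  # g = exp, so g' = exp

z0 = 0.3 + 0.7j
h = 1e-6
numeric = (g(f(z0 + h)) - g(f(z0))) / h       # difference quotient of g∘f
exact = gp(f(z0)) * fp(z0)                    # chain-rule value
print(abs(numeric - exact))                   # small discrepancy
```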

For example, $f(z)=z^3$ is entire and one-to-one on the domain $D = \{z \in \mathbb{C} : z \neq 0 \text{ and } |\arg(z)| < \pi/3\}$. This angular restriction is necessary because the cube roots of unity — $1$, $\omega = e^{2\pi i/3}$, $\omega^2 = e^{4\pi i/3}$ — are spaced $120°$ apart; if the domain included more than $2\pi/3$ of angular width, there would exist distinct points $z_1, z_2 \in D$ with $z_1^3 = z_2^3$, violating injectivity. We exclude $z = 0$ because $f'(0) = 3 \cdot 0^2 = 0$, violating the non-vanishing derivative condition.

Since f is a polynomial, it's analytic everywhere. Moreover, $f'(z) = 3z^2$ never vanishes on this domain, so the inverse function $f^{-1}(w) = w^{1/3}$ (choosing the appropriate branch) is analytic on $f(D)$, with derivative: $\frac{d}{dw} w^{1/3} = \frac{1}{3(w^{1/3})^2} = \frac{1}{3} w^{-2/3}.$
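A brief numerical check of the inverse-derivative formula (the sample point $w_0 = 2 + i$ is an arbitrary choice away from the negative real axis, where Python's complex power uses the principal branch):

```python
# Principal cube root via the complex power operator, which Python computes
# as exp(log(w)/3) with the principal logarithm.
cbrt = lambda w: w ** (1/3)

w0 = 2 + 1j
h = 1e-6
numeric = (cbrt(w0 + h) - cbrt(w0)) / h   # difference quotient of w**(1/3)
exact = (1/3) * w0 ** (-2/3)              # the formula (1/3) w^{-2/3}
print(abs(numeric - exact))               # small discrepancy
```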

These closure properties — under arithmetic operations, composition, and inversion — make the class of analytic functions a very robust and well-behaved collection of objects. In algebraic language, the analytic functions on a domain form a ring (under addition and multiplication) and even an integral domain (by the Identity Theorem, if $f \cdot g \equiv 0$ and $f \not\equiv 0$, then $g \equiv 0$).

Examples of Analytic Functions

Let $f(z) = z^2 + 3z + 2$. Writing $z = x + iy$: $f(z) = (x+iy)^2 + 3(x+iy) + 2 = (x^2 - y^2 + 3x + 2) + i(2xy + 3y)$. So $u(x, y) = x^2 - y^2 + 3x + 2$ and $v(x, y) = 2xy + 3y$.

Let’s compute the partial derivatives: $\frac{\partial u}{\partial x} = 2x+3, \frac{\partial u}{\partial y} = -2y, \frac{\partial v}{\partial x} = 2y, \frac{\partial v}{\partial y}=2x+3$

Since the Cauchy–Riemann equations are satisfied and the partial derivatives are continuous everywhere, the function f(z) is analytic everywhere in the complex plane, i.e., is entire.
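These partial derivatives can be double-checked numerically with centered finite differences (a sketch; the sample point is arbitrary):

```python
# Numerical check of the Cauchy–Riemann equations u_x = v_y, u_y = -v_x
# for f(z) = z**2 + 3z + 2.
f = lambda z: z**2 + 3*z + 2
u = lambda x, y: f(complex(x, y)).real
v = lambda x, y: f(complex(x, y)).imag

x0, y0, h = 1.2, -0.7, 1e-5
ux = (u(x0 + h, y0) - u(x0 - h, y0)) / (2*h)   # ∂u/∂x ≈ 2x + 3
uy = (u(x0, y0 + h) - u(x0, y0 - h)) / (2*h)   # ∂u/∂y ≈ -2y
vx = (v(x0 + h, y0) - v(x0 - h, y0)) / (2*h)   # ∂v/∂x ≈ 2y
vy = (v(x0, y0 + h) - v(x0, y0 - h)) / (2*h)   # ∂v/∂y ≈ 2x + 3
print(ux - vy, uy + vx)  # both approximately 0 (Cauchy–Riemann holds)
```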

The elementary transcendental functions are defined on $\mathbb{C}$ by their power series:

$$e^z = \sum_{n=0}^{\infty} \frac{z^n}{n!}, \qquad \cos z = \sum_{n=0}^{\infty} (-1)^n \frac{z^{2n}}{(2n)!}, \qquad \sin z = \sum_{n=0}^{\infty} (-1)^n \frac{z^{2n+1}}{(2n+1)!}.$$

Since each series has infinite radius of convergence ($R = \infty$), these functions are analytic on all of $\mathbb{C}$. Their derivatives follow the same formulas as in real calculus: $(e^z)' = e^z$, $(\sin z)' = \cos z$, $(\cos z)' = -\sin z$.
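A short numerical sketch comparing a truncated exponential series with `cmath.exp` (the truncation point of 30 terms is an illustrative choice; because $R = \infty$, a modest number of terms already gives high accuracy for moderate $|z|$):

```python
import cmath
from math import factorial

# Partial sum of the power series for e^z.
def exp_partial(z, n_terms=30):
    return sum(z**n / factorial(n) for n in range(n_terms))

z = 1 + 2j
err = abs(exp_partial(z) - cmath.exp(z))
print(err)  # tiny truncation error
```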

For example, $h(z) = \frac{1}{z^2 + 1}$. The inner function is $f(z) = z^2 + 1$. This is a polynomial, and hence entire. The outer function is $g(w) = \frac{1}{w}$, which is analytic for all $w \neq 0$. For the composite function h(z) = g(f(z)) to be well-defined and analytic, we need $f(z) \neq 0$, i.e., $z^2 + 1 \neq 0$, which fails only at $z = \pm i$. Therefore $h(z)$ is analytic on $\mathbb{C} \setminus \{i, -i\}$.

By the chain rule: $h'(z) = g'(f(z)) \cdot f'(z) = \frac{-1}{(z^2 + 1)^2} \cdot 2z = \frac{-2z}{(z^2 + 1)^2}.$

The points $z=\pm i$ are poles (isolated singularities) of h; everywhere else, h has well-defined complex derivatives of all orders. This illustrates how rational functions, though not entire, are locally analytic wherever they are defined.

Example 1. $h(z) = e^{z^2}$ is entire. The inner function $f(z) = z^2$ and the outer function $g(w) = e^w$ are both entire. Since $f(\mathbb{C}) \subseteq \text{domain}(g) = \mathbb{C}$, the domain compatibility condition is met. Therefore, by the composition theorem, $h(z) = g(f(z)) = e^{z^2}$ is an entire function. The derivative is obtained by the chain rule: $h'(z) = e^{z^2} \cdot 2z = 2z e^{z^2}.$

Example 2. $h(z) = \cos(z^2 + 1)$ is entire. The inner function $f(z) = z^2 + 1$ and the outer function $g(w) = \cos w$ are both entire (a polynomial and the cosine function, respectively). Since $f(\mathbb{C}) \subseteq \text{domain}(g) = \mathbb{C}$, the domain compatibility condition is met, too. By the chain rule: $h'(z) = -\sin(z^2 + 1) \cdot 2z = -2z\sin(z^2 + 1).$

The complex logarithm is inherently multivalued: since $e^z$ is $2\pi i$-periodic, every non-zero $w$ has infinitely many logarithms, differing by integer multiples of $2\pi i$. However, if we restrict the domain to exclude a ray from the origin — for instance, the negative real axis — then $\log z$ becomes a single-valued analytic function on the resulting cut plane $\mathbb{C} \setminus \{x \in \mathbb{R} : x \leq 0\}$. This particular choice gives the principal branch of the logarithm: $\text{Log}(z) = \ln|z| + i\,\text{Arg}(z), \qquad \text{Arg}(z) \in (-\pi, \pi),$ which is analytic on that cut plane.

The derivative of any analytic branch of the logarithm is derived by differentiating $e^{\log z} = z$ using the chain rule: $e^{\log z} \cdot \frac{d}{dz}\log z = 1 \quad \Longrightarrow \quad \frac{d}{dz}\log z = \frac{1}{z}.$

Example. $h(z) = \log(z^2 + 1)$ using the principal branch of $\log$. The inner function is $f(z) = z^2 + 1$, a polynomial and thus an entire function. The outer function $g(w) = \text{Log}(w)$ is analytic on $\mathbb{C} \setminus (-\infty, 0]$.

For $h(z)$ to be well-defined and analytic, we must both choose a branch of $\log$ and ensure that $f(z)$ avoids the branch cut and the singularity of $\log$. Using the principal branch of $\log$ (cut along $(-\infty,0]$ on the real axis), we need $z^2 + 1 \notin (-\infty, 0]$, i.e., $z^2 + 1$ must never be a non-positive real number.

Let $z = x + iy$, so $z^2 + 1 = (x^2 - y^2 + 1) + 2ixy$. This lies on the non-positive real axis if and only if $\text{Im}(z^2 + 1) = 2xy = 0$ (so $x = 0$ or $y = 0$), and also $\text{Re}(z^2 + 1) = x^2 - y^2 + 1 \leq 0$.

Case $y = 0$: $z^2 + 1 = x^2 + 1 \geq 1 > 0$. This never lies on $(-\infty, 0]$. ✓

Case $x = 0$: $z^2 + 1 = -y^2 + 1$. This is non-positive when $y^2 \geq 1$, i.e., $|y| \geq 1$. ✗

Therefore $h(z)$ is analytic on $\mathbb{C} \setminus \{iy : |y| \geq 1\}$. On this domain, h has a well-defined derivative given by the chain rule $h'(z) = \frac{1}{z^2 + 1} \cdot 2z = \frac{2z}{z^2 + 1}.$
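The case analysis above can be mirrored in code with a small helper (hypothetical, for illustration) that tests whether $w = z^2 + 1$ lands on the principal branch cut $(-\infty, 0]$:

```python
# Check whether a complex number lies on the branch cut (-inf, 0] of the
# principal logarithm (up to a numerical tolerance for the imaginary part).
def on_cut(w, tol=1e-12):
    return abs(w.imag) < tol and w.real <= 0

ok = on_cut((0.5j)**2 + 1)   # z = 0.5i gives w = 0.75 > 0: not on the cut
bad = on_cut((2j)**2 + 1)    # z = 2i gives w = -3: on the cut, h undefined
print(ok, bad)               # False True
```

This matches the conclusion that $h$ is analytic away from $\{iy : |y| \geq 1\}$.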

Counterexamples (Non-Analytic Functions)

Not every complex function is analytic. In fact, complex-differentiability is a very stringent condition, and many functions that are perfectly well-behaved from a real-variable standpoint fail to be holomorphic.

Example. $f(z) = |z|^2$. Writing $z = x + iy$, we have $u(x,y) = x^2 + y^2$ and $v(x,y) = 0$, which give $\frac{\partial u}{\partial x} = 2x$, $\frac{\partial v}{\partial y} = 0$, equal only at $x = 0$; likewise $\frac{\partial u}{\partial y} = 2y$, $\frac{\partial v}{\partial x} = 0$, compatible only at $y = 0$. So the only point where the Cauchy–Riemann equations hold is $(0,0)$, and f is complex-differentiable only at $z = 0$.

One can also check directly (verification via the limit definition): $f'(z) = \lim_{h \to 0} \frac{f(z+h)-f(z)}{h} = \lim_{h \to 0} \frac{|z+h|^2-|z|^2}{h} = \lim_{h \to 0} \frac{z\bar h + \bar z h + |h|^2}{h} = \lim_{h \to 0}\left(z\,\frac{\bar h}{h} + \bar z + \bar h\right) = \lim_{h \to 0} z\,\frac{\bar h}{h} + \bar z$

Recall: $|z+h|^2 = (z + h)\overline{z + h} = (z + h)(\bar z + \bar h) = |z|^2 + z\bar h + \bar zh + |h|^2$

  1. If $h = t \in \mathbb{R}$, then $\frac{\bar h}{h} = 1$, so $\lim_{h \to 0} z\,\frac{\bar h}{h} + \bar z = z + \bar z = 2\,\text{Re}(z)$.
  2. If $h = it$, $t \in \mathbb{R}$, then $\frac{\bar h}{h} = -1$, so $\lim_{h \to 0} z\,\frac{\bar h}{h} + \bar z = -z + \bar z = -2i\,\text{Im}(z)$. Note: $-z + \bar{z} = -(x + iy) + (x - iy) = -2iy$.
  3. Conclusion: These are equal only when $z = 0$, confirming that $f$ is complex-differentiable at $z = 0$ alone (with $f'(0) = 0$).
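A numerical sketch of this direction dependence at the sample point $z_0 = 1 + i$ (any non-zero point would do):

```python
# Difference quotients of f(z) = |z|**2 along two directions at z0 = 1 + i.
# Along the real axis the quotient tends to z0 + conj(z0) = 2*Re(z0) = 2;
# along the imaginary axis it tends to -z0 + conj(z0) = -2i*Im(z0) = -2i.
f = lambda z: abs(z)**2
z0 = 1 + 1j
t = 1e-6
q_real = (f(z0 + t) - f(z0)) / t           # h real
q_imag = (f(z0 + 1j*t) - f(z0)) / (1j*t)   # h imaginary
print(q_real, q_imag)  # different limits: f'(1+i) does not exist
```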

Why $f$ is not analytic at $z = 0$. Despite being complex-differentiable at the origin, $|z|^2$ is not analytic there. In every neighborhood of $0$, no matter how small, there exist points where the derivative does not exist (in fact, every nonzero point). Analyticity requires differentiability throughout an open neighborhood, not merely at a single point.

This example underscores that being differentiable at an isolated point (as $|z|^2$ is at 0) is not sufficient for analyticity; analyticity demands differentiability in a full neighborhood. In summary, $|z|^2$ is not analytic on any open set (it fails to meet the criteria of the definition), even though it is differentiable at the single point $0$.

These examples reinforce a key theme: any function whose output depends on “real-variable” information extracted from $z$ — its real part, imaginary part, modulus, or conjugate — is typically not analytic. This is because these operations break the complex number z into its real and imaginary parts, treating them separately rather than as a single, indivisible complex entity.

The properties of the complex exponential — which is an entire function, periodic with period $2\pi i$, and surjective onto $\mathbb{C} \setminus \{0\}$ — allow us to solve equations that have no real solutions or that have more solutions than one might expect. Example: solve $e^z = 2 + 2i$.

  1. First, express 2 + 2i in polar form. $r = |2 + 2i| = \sqrt{2^2 + 2^2} = \sqrt{8} = 2\sqrt{2}$, $\tan(\theta) = \frac{2}{2} = 1$. Since the point (2, 2) is in the first quadrant, the principal argument is $\theta = \frac{\pi}{4}$, so $2 + 2i = 2\sqrt{2}\,e^{i\pi/4}$.
  2. Writing $z = x + iy$ and using $e^z = e^x e^{iy}$, we need: $e^x = 2\sqrt{2}, \qquad y = \frac{\pi}{4} + 2k\pi \quad (k \in \mathbb{Z}).$
  3. Simplifying this, we get $x = \ln(2\sqrt{2}) = \ln(2^{3/2}) = \frac{3}{2}\ln 2$.
  4. Thus, the solutions are: $z = \frac{3}{2}\ln 2 + i\left(\frac{\pi}{4} + 2k\pi\right)$, for any integer k.

There are infinitely many solutions, equally spaced along a vertical line in the complex plane at $x = \frac{3}{2}\ln 2$. This infinite multiplicity reflects the $2\pi i$-periodicity of $e^z$.
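The solution family can be verified numerically for a few values of $k$ (an illustrative check):

```python
import cmath
from math import log, pi

# Verify that z = (3/2) ln 2 + i(pi/4 + 2k*pi) satisfies e**z = 2 + 2i
# for several integers k, reflecting the 2*pi*i periodicity of exp.
x = 1.5 * log(2)
for k in (-2, 0, 3):
    z = complex(x, pi/4 + 2*k*pi)
    print(k, cmath.exp(z))  # approximately (2+2j) for every k
```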


JustToThePoint Copyright © 2011 - 2026 Anawim. ALL RIGHTS RESERVED.
