
Continuity

Now is the time for us to shine. The time when our dreams are within reach and possibilities vast. Now is the time for all of us to become the people we have always dreamed of being. This is your world. You’re here. You matter. The world is waiting, Haley James Scott.


Why Continuity Matters

Continuity is not a mere technicality: it is the backbone and foundation upon which analysis is built. It formalizes the intuitive principle that "small changes in input produce small changes in output" (small perturbations in the input should not cause wild jumps in the output) and enables the transfer of geometric and topological intuition into rigorous mathematics. In complex analysis, continuity serves as:

  1. A prerequisite for differentiability: Every differentiable function is continuous (but not conversely!).
  2. A necessary condition for integration: We integrate continuous functions over curves. For a piecewise-smooth contour $\gamma: [a, b] \to \mathbb{C}$ and f continuous on $\gamma([a, b])$, the integral $\int_{\gamma}f(z)\,dz = \int_{a}^{b} f(\gamma(t))\gamma'(t)\,dt$ exists as a Riemann integral.
  3. A bridge between topology and analysis: The ε-δ definition of continuity in metric spaces is equivalent to the topological definition that f is continuous iff the preimage of every open set is open.
  4. A minimum standard for “reasonable” functions: Continuity filters out pathological behavior.

With it, we gain a minimal coherent framework where deeper structures (differentiability, analyticity, conformality) can be meaningfully defined and studied.
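As a quick numerical illustration of the contour-integral formula in item 2, the sketch below approximates $\int_{\gamma} dz/z$ over the unit circle, whose exact value is $2\pi i$. The `contour_integral` helper and the midpoint rule are illustrative choices, not from the text.

```python
import cmath

def contour_integral(f, gamma, dgamma, a, b, n=100_000):
    """Midpoint-rule approximation of ∫_γ f(z) dz = ∫_a^b f(γ(t)) γ'(t) dt."""
    h = (b - a) / n
    total = 0j
    for k in range(n):
        t = a + (k + 0.5) * h
        total += f(gamma(t)) * dgamma(t) * h
    return total

# γ(t) = e^{it} parametrizes the unit circle, with γ'(t) = i e^{it}.
gamma = lambda t: cmath.exp(1j * t)
dgamma = lambda t: 1j * cmath.exp(1j * t)

# f(z) = 1/z is continuous on the circle, and the exact integral is 2πi.
I = contour_integral(lambda z: 1 / z, gamma, dgamma, 0.0, 2 * cmath.pi)
print(I)   # ≈ 6.2832j
```

Continuity of $f$ on the contour is exactly what guarantees that this Riemann sum converges as the partition is refined.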

Definition (Limit-based Continuity). A function $f: U \to \mathbb{C}$ is continuous at a point a ∈ U if $\lim_{z \to a} f(z) = f(a)$

This requires three conditions:

  1. f(a) is defined (i.e., a is in the domain U and f maps a to $f(a) \in \mathbb{C}$).
  2. The limit $\lim_{z \to a} f(z)$ exists.
  3. The limit equals f(a).

    This aligns with the intuitive idea that “small changes in input near a produce small changes in output near f(a).”

The formal definition translates this intuition into mathematical precision:

Formal definition (Epsilon-Delta). f is continuous at a if and only if: $\forall \varepsilon > 0, \; \exists \delta > 0 : \forall z \in U, \; |z - a| < \delta \Rightarrow |f(z) - f(a)| < \varepsilon$.

In words: for any desired closeness ε in the output, we can find a corresponding closeness δ in the input that guarantees it.
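To make the quantifiers concrete, here is a minimal numerical sketch for $f(z) = z^2$. The helper `delta_for_square` and the bound it encodes are my own illustration: the explicit choice $\delta = \min(1, \varepsilon/(1 + 2|a|))$ works because $|z^2 - a^2| = |z - a|\,|z + a| < \delta(1 + 2|a|) \le \varepsilon$.

```python
import math, random

def delta_for_square(a, eps):
    """A δ witnessing ε-δ continuity of f(z) = z² at a:
    if |z − a| < δ ≤ 1, then |z² − a²| = |z − a|·|z + a| < δ(1 + 2|a|) ≤ ε."""
    return min(1.0, eps / (1 + 2 * abs(a)))

f = lambda z: z * z
a, eps = 2 + 1j, 1e-3
delta = delta_for_square(a, eps)

random.seed(0)
worst = 0.0
for _ in range(10_000):
    r, t = delta * random.random(), 2 * math.pi * random.random()
    z = a + r * complex(math.cos(t), math.sin(t))   # a point with |z − a| < δ
    worst = max(worst, abs(f(z) - f(a)))
print(worst < eps)   # True: every sampled z lands within ε of f(a)
```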

Unpacking the Definition

Definition (Sequential Continuity). Let D ⊆ ℂ and f: D → ℂ. The function f is continuous at a ∈ D if for every sequence $\{z_n\}$ in D with $z_n \to a$, we have $\lim_{n \to \infty} f(z_n) = f(a)$.

Equivalence of Definitions. The ε-δ definition and the sequential definition are equivalent.

Proof.

(ε-δ ⟹ Sequential): Suppose f is ε-δ continuous at a. Let $\{z_n\}$ be a sequence in D with $z_n \to a$.

Given ε > 0, there exists δ > 0 such that |z - a| < δ ⟹ |f(z) - f(a)| < ε.

Since $z_n \to a$, there exists N such that n ≥ N $\implies |z_n - a| < δ$.

For n ≥ N: |zₙ - a| < δ, hence |f(zₙ) - f(a)| < ε. Therefore, $f(z_n) \to f(a)$.

(Sequential ⟹ ε-δ): We prove the contrapositive.

Suppose f is NOT ε-δ continuous at a. Then, $\exists \varepsilon_0 > 0 : \forall \delta > 0, \exists z \in D \text{ with } |z - a| < \delta \text{ but } |f(z) - f(a)| \geq \varepsilon_0$

For each n ∈ ℕ, take δ = 1/n. There exists $z_n \in D$ with (i) $|z_n - a| < 1/n$; and (ii) $|f(z_n) - f(a)| \geq \varepsilon_0$

Then, $z_n \to a$ but $f(z_n) \not\to f(a)$, so sequential continuity fails. ∎

Definition (Continuity on a Set). f is continuous on U if it is continuous at every point of U. If U is compact (closed and bounded), every continuous function on U is uniformly continuous (a stronger condition in which δ depends only on ε, not on the point a).

Let $f: U \subseteq \mathbb{C} \to \mathbb{C}$ (or $\mathbb{R} \to \mathbb{R}$).

  1. Pointwise Continuity on U: $\forall a \in U, \forall \varepsilon > 0, \exists \delta = \delta(a, \varepsilon) > 0 : \forall z \in U, |z - a| < \delta \implies |f(z) - f(a)| < \varepsilon$
  2. Uniform Continuity on U: $\forall \varepsilon > 0, \exists \delta = \delta(\varepsilon) > 0 : \forall a, z \in U, |z - a| < \delta \implies |f(z) - f(a)| < \varepsilon$

    The critical difference: In uniform continuity, $\delta$ is chosen before knowing which points are involved.

Example. Let $f: (0, 1] \to \mathbb{R}$, $f(x) = \frac{1}{x}$.

  1. To prevent $f(x) = \frac{1}{x}$ from “blowing up” near x = 0, we restrict x to be at least $\frac{a}{2}$.
    By choosing $\delta \le \frac{a}{2}$, we ensure: $|x - a| \lt \frac{a}{2} \implies -\frac{a}{2} \lt x - a \lt \frac{a}{2} \implies x \gt a -\frac{a}{2} = \frac{a}{2}$.
    This guarantees $x \gt\frac{a}{2}$, so $\frac{1}{x} \lt \frac{2}{a}$, avoiding unbounded behavior.
  2. The difference satisfies $|\frac{1}{x}-\frac{1}{a}| = \frac{|x-a|}{|xa|} \lt \frac{|x-a|}{(\frac{a}{2})a} = \frac{2|x-a|}{a^2}$, which stays below $\varepsilon$ provided $|x - a| \lt \frac{a^2\varepsilon}{2}$.
  3. Both conditions must hold, and taking the minimum ensures both are satisfied: $\delta = \min(\frac{a}{2}, \frac{a^2\varepsilon}{2})$
  4. At $a = 1$, $\delta \approx \varepsilon$ works. At $a = 0.01 = 10^{-2}$, we need $\delta \approx 10^{-4}\varepsilon$. At $a = 0.001$, we need $\delta \approx 10^{-6}\varepsilon$.
  5. As $a \to 0^+$, the required $\delta$ shrinks to zero (proportionally to $a^2\varepsilon$). No single $\delta > 0$ works for all $a \in (0, 1]$.

Conclusion: $f(x) = 1/x$ is continuous on $(0, 1]$ but not uniformly continuous.

The problem: $(0, 1]$ is not compact, since the limit point 0 is missing.
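The shrinking δ's computed in the example can be tabulated directly. This small illustrative script just evaluates the formula $\delta(a, \varepsilon) = \min(a/2, a^2\varepsilon/2)$ derived above:

```python
# δ(a, ε) = min(a/2, a²ε/2) from the derivation above; tabulating it for a
# fixed ε shows the admissible δ collapsing as a → 0⁺, so no single δ
# works across all of (0, 1]: uniform continuity fails.
def delta(a, eps):
    return min(a / 2, a * a * eps / 2)

eps = 0.1
for a in [1.0, 0.1, 0.01, 0.001]:
    print(f"a = {a:<6} δ = {delta(a, eps):.2e}")
```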

Heine-Cantor Theorem. Let $K \subset \mathbb{C}$ be a compact set. If a function $f: K \to \mathbb{C}$ is continuous on K, then $f$ is uniformly continuous on $K$.

Proof (by Contradiction)

  1. Assume $f$ is continuous but not uniformly continuous on compact $K$.
  2. This means there exists a “bad” $\varepsilon_0 > 0$ for which no $\delta$ works. Formally, there exists $\varepsilon_0 > 0$ such that $\forall \delta > 0$, there are points $a, z \in K$ with: $|z - a| < \delta \quad \text{but} \quad |f(z) - f(a)| \geq \varepsilon_0$
  3. Let’s construct a “bad” sequence.
    Since the condition above fails for every $\delta > 0$, we can choose a sequence of $\delta$’s getting smaller and smaller to force the points together.
    Take $\delta = \frac{1}{n}$ for $n = 1, 2, 3, \ldots$. This yields sequences $\{a_n\}, \{z_n\} \subset K$ such that: $|z_n - a_n| < \frac{1}{n} \quad (\star) \quad \text{and yet} \quad |f(z_n) - f(a_n)| \geq \varepsilon_0 \quad (\star\star)$
  4. Extract convergent subsequences Since $K$ is compact (specifically, sequentially compact), every sequence in $K$ has a subsequence that converges to a limit in K.
    So we can extract from $\{a_n\}$ a convergent subsequence: $a_{n_k} \to L \in K$ as $k \to \infty$.
  5. We claim that the corresponding subsequence $\{z_{n_k}\}$ also converges to the same limit $L$: $z_{n_k} \to L$. By the Triangle Inequality and $(\star)$, $|z_{n_k} - L| \leq |z_{n_k} - a_{n_k}| + |a_{n_k} - L| < \frac{1}{n_k} + |a_{n_k} - L|$. As $k \to \infty$, $\frac{1}{n_k} \to 0$ and $|a_{n_k} - L| \to 0$. Therefore, $|z_{n_k} - L| \to 0$, so $z_{n_k} \to L$.
  6. Since $f$ is continuous at the point $L \in K$: $f(a_{n_k}) \to f(L) \quad \text{and} \quad f(z_{n_k}) \to f(L)$. This implies that the distance between their images must vanish: $\lim_{k \to \infty} |f(z_{n_k}) - f(a_{n_k})| = |f(L) - f(L)| = 0$
  7. Compare this result with our construction in Step 3 $(\star\star)$: (i) From Continuity: $|f(z_{n_k}) - f(a_{n_k})| \to 0$; (ii) From Construction: $|f(z_{n_k}) - f(a_{n_k})| \geq \varepsilon_0$ for all $k$.
    These two statements cannot both be true. The distance cannot approach 0 if it is always stuck above a positive number $\varepsilon_0$.
  8. Conclusion: This contradiction shows the assumption in Step 1 was false, so $f$ must be uniformly continuous. $\blacksquare$
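A numerical sanity check of the theorem, as an illustrative sketch with constants chosen for the specific compact set $K = [0.1, 1]$: there $|1/x - 1/a| = |x - a|/(xa) \le |x - a|/0.01$, so a single δ proportional to ε serves every point of $K$ at once.

```python
import random

# On the compact set K = [0.1, 1], f(x) = 1/x satisfies
# |1/x − 1/a| = |x − a|/(xa) ≤ |x − a|/0.01,
# so the single choice δ = 0.01·ε works at every point of K, exactly as
# Heine-Cantor predicts (we halve δ below for numerical headroom).
eps = 1e-3
delta = 0.005 * eps

random.seed(1)
ok = True
for _ in range(100_000):
    a = random.uniform(0.1, 1.0)
    x = min(1.0, max(0.1, a + random.uniform(-delta, delta)))  # stay inside K
    ok = ok and abs(1 / x - 1 / a) < eps
print(ok)   # True: one δ serves every point of K
```

Contrast this with the earlier table for $(0, 1]$, where δ had to shrink with the base point.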

Sequential Characterization Theorem. f is continuous at a if and only if for every sequence $\{z_n\}$ in U with $z_n \to a$, we have $f(z_n) \to f(a)$. For example, since $f(z) = z^2$ is continuous at a, for $z_n = a + \frac{1}{n}$ we get $f(z_n) = (a + \frac{1}{n})^2 \to a^2 = f(a)$.

Properties of Continuous Functions

Theorem. If f and g are continuous at a, then so are $f + g$, $f - g$, $fg$, and $f/g$ (provided $g(a) \neq 0$).

Theorem (Composition). If f is continuous at a and g is continuous at f(a), then $g \circ f$ is continuous at a.

Component-wise Continuity Theorem. Let f(z) = u(x, y) + iv(x, y). Then, f is continuous at $a = \alpha + i\beta$ if and only if the real-valued functions u and v are continuous at $(\alpha, \beta)$ in ℝ². Complex continuity reduces to continuity of the real and imaginary parts in the plane.
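For instance, with $f(z) = z^2$ the components are $u(x, y) = x^2 - y^2$ and $v(x, y) = 2xy$. A short illustrative check (the point $a = 1 + 2i$ and the diagonal approach path are arbitrary choices) that the components reproduce $f$ and that the error shrinks as $z \to a$:

```python
# u and v are the real and imaginary parts of f(z) = z².
def u(x, y): return x * x - y * y
def v(x, y): return 2 * x * y

a = 1 + 2j
errs = []
for h in [1e-1, 1e-3, 1e-6]:
    z = a + h * (1 + 1j)               # approach a along the diagonal
    fz = complex(u(z.real, z.imag), v(z.real, z.imag))
    errs.append(abs(fz - a * a))       # |f(z) − f(a)| via the components
print(errs)   # shrinks toward 0, witnessing continuity at a
```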

Examples and Counterexamples

  1. Continuous Functions: $f(z) = z$ is continuous everywhere. $f(z) = e^z$ is also continuous everywhere.
  2. Discontinuous Function: $f(z) = \begin{cases} 1, &z = 0 \\\\ 0, &\text{otherwise} \end{cases}$. Here $\lim_{z \to 0}f(z) = 0 \ne f(0) = 1$.
  3. Removable Discontinuity: $f(z) = \frac{\sin(z)}{z}$ at z = 0. Defining f(0) = 1 makes it continuous.
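The removable discontinuity in item 3 can be seen numerically. A minimal sketch, where the diagonal approach path is an arbitrary illustrative choice:

```python
import cmath

# sin(z)/z is undefined at z = 0, but its limit there is 1,
# so defining f(0) = 1 removes the discontinuity.
def f(z):
    return cmath.sin(z) / z if z != 0 else 1.0

for n in [1, 2, 4, 8]:
    z = 10.0 ** (-n) * (1 + 1j)        # points approaching 0 diagonally
    print(abs(f(z) - f(0)))            # → 0 as z → 0
```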

Remarks

Definition. A metric space is a set M equipped with a distance or metric function d: $M × M \to \mathbb{R}$ that defines a notion of distance between any two points in the set. It satisfies the following four axioms, ∀x, y, z ∈ M (for any three points in the set):

  1. Non-negativity: d(x, y) ≥ 0. The distance between any two points is always non-negative.
  2. Identity of indiscernibles: d(x, y) = 0 if and only if x = y. Zero distance implies identical points.
  3. Symmetry: d(x, y) = d(y, x). The distance from x to y equals the distance from y to x. In other words, distance is bidirectional.
  4. Triangle inequality: d(x, z) ≤ d(x, y) + d(y, z). The distance from x to z is at most the sum of the distances from x to y and y to z. Direct paths are never longer than indirect ones.
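The four axioms can be spot-checked for the standard metric $d(z, w) = |z - w|$ on ℂ. This is an illustrative random test, not a proof:

```python
import random

# Spot-check of the four metric axioms for d(z, w) = |z − w| on ℂ.
def d(z, w):
    return abs(z - w)

random.seed(42)
for _ in range(10_000):
    x, y, z = (complex(random.uniform(-5, 5), random.uniform(-5, 5))
               for _ in range(3))
    assert d(x, y) >= 0                          # non-negativity
    assert d(x, x) == 0                          # identity of indiscernibles
    assert d(x, y) == d(y, x)                    # symmetry
    assert d(x, z) <= d(x, y) + d(y, z) + 1e-12  # triangle (float slack)
print("all four axioms hold on the random sample")
```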

Common Examples: the real line ℝ with d(x, y) = |x - y|; the complex plane ℂ with d(z, w) = |z - w|; ℝⁿ with the Euclidean distance; and any set with the discrete metric (d(x, y) = 1 if x ≠ y, and 0 otherwise).

Definition: A metric space (M, d) (or a subset of a metric space, $S \subseteq M$) is said to be complete if every Cauchy sequence in that space M (or subset S) converges to a limit that is also within that space (or subset).

Idea🧠: No “missing” limits. If a sequence “should” converge (it is a Cauchy sequence), it must do so within the space.

Key Examples

  1. Incomplete Space: $\mathbb{Q}$. Consider the set of rational numbers $\mathbb{Q}$ (with the usual absolute value as the metric).
    The sequence defined by the decimal approximations of $\sqrt{2}$ (1, 1.4, 1.41, 1.414, …) is obviously a Cauchy sequence in $\mathbb{Q}$.
    However, the limit of this sequence is $\sqrt{2}$, which is not a rational number ($\sqrt{2} \notin \mathbb{Q}$). Therefore, the set of rational numbers $\mathbb{Q}$ is not complete.
  2. Complete Spaces: $\mathbb{R}$ and $\mathbb{C}$. On the other hand, the set of real numbers ℝ (with the usual absolute value as the metric) is complete. Any Cauchy sequence of real numbers converges to a limit that is also a real number.
    Similarly, the complex plane ℂ is complete.
  3. The interval $[0, 1] \subset \mathbb{R}$ is complete (closed and bounded).
    Let $\{ x_n \} \subset [0, 1]$ be Cauchy. Since $\mathbb{R}$ is complete, $x_n \to L$ for some $L \in \mathbb{R}$.
    Since [0, 1] is closed, it contains all its limit points. Therefore, $L \in [0, 1]$.

    Theorem. A closed subset of a complete metric space is complete.

  4. The open interval $(0, 1) \subset \mathbb{R}$ is NOT complete (sequence 1/n is Cauchy but converges to 0 ∉ (0, 1)).
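The sequence $1/n$ from item 4 can be examined directly. A small illustrative script: its tails cluster (the Cauchy property), every term stays inside $(0, 1)$, yet the limit 0 escapes the interval.

```python
def x(n):
    return 1.0 / n

# The tail spread max |x_m − x_N| over N ≤ m < 20N is at most 1/N,
# so the sequence is Cauchy.
for N in [10, 100, 1000]:
    spread = max(abs(x(m) - x(N)) for m in range(N, 20 * N))
    print(N, spread, spread <= 1.0 / N)

# Every term lies inside (0, 1) ...
assert all(0 < x(n) < 1 for n in range(2, 10_000))
# ... yet the limit is 0, which is NOT in (0, 1): the space is incomplete.
```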

Theorem. The real line ℝ is complete.

The Least Upper Bound Property (LUBP) states that every nonempty subset of ℝ that is bounded above has a least upper bound (supremum) in ℝ.

Proof.

  1. Every Cauchy sequence is bounded. First, we must ensure the sequence doesn’t “blow up” to infinity.
    Let $\{x_n\}$ be a Cauchy sequence. Choose $\varepsilon = 1$. By the definition of Cauchy sequences, there exists an integer $N$ such that for all $n, m \geq N$, $|x_n - x_m| < 1$.
    Fix $m = N$. Then, $\forall n \geq N:|x_n - x_N| < 1 \implies x_N - 1 < x_n < x_N + 1$. This shows that the “tail” of the sequence ($n \geq N$) is bounded.
    The set of all terms is $\{x_1, x_2, \dots, x_{N-1}, x_N, \dots\}$. The first $N-1$ terms are finite in number and thus have a maximum and minimum.
    Let $B = \max(|x_1|, |x_2|, \dots, |x_{N-1}|, |x_N| + 1)$. Then $|x_n| \leq B$ for all $n$. Conclusion: The sequence $\{x_n\}$ is bounded.
  2. We now construct a new sequence based on the tails of the original sequence to pin down the limit. Consider the set of terms starting from index $n$: $A_n = \{x_k : k \geq n\}$.
    From the previous discussion, we know the entire sequence is bounded above by $B$. Thus, every set $A_n$ is non-empty and bounded above.
    By the Least Upper Bound Property (LUBP), the supremum of $A_n$ exists in $\mathbb{R}$. Let us define this as $s_n, s_n = \sup \{ x_k : k \geq n \}$
    Monotonicity: Notice that $A_{n+1} \subseteq A_n$ (the set $A_{n+1}$ is just $A_n$ with the element $x_n$ removed). Removing an element from a set cannot increase its supremum. Therefore, $s_{n+1} \leq s_n$. Thus, $\{s_n\}$ is a monotone decreasing sequence.
    Bounded Below: Since the original sequence $\{x_n\}$ is bounded below (by $-B$), the sequence of supremums $\{s_n\}$ is also bounded below by $-B$.
    By the Monotone Convergence Theorem (a direct consequence of LUBP), a monotone decreasing sequence that is bounded below must converge. Let $L = \lim_{n \to \infty} s_n$.

    Because $\{ x_n \}$ is Cauchy, the terms eventually cluster tightly. Because $s_n$ is the supremum of the tail, it must squeeze down toward the same cluster point.

  3. We now show that the original sequence $\{x_n\}$ is “squeezed” toward this limit $L$.
  4. Let $\varepsilon > 0$ be arbitrary. Since $\{x_n\}$ is Cauchy, there exists an $N$ such that for all $n, k \geq N: |x_k - x_n| < \frac{\varepsilon}{2}$. This inequality can be rewritten as: $x_k < x_n + \frac{\varepsilon}{2} \quad \text{for all } k \geq n \geq N$
  5. Since $x_n + \frac{\varepsilon}{2}$ is an upper bound for all $x_k$ (where $k \ge n$), the least upper bound ($s_n$) must be less than or equal to it: $s_n \leq x_n + \frac{\varepsilon}{2} (\star)$
  6. We rearrange this last inequality $(\star)$ to bound the distance between the sequence and its tail supremum: $s_n - \frac{\varepsilon}{2} \leq x_n.$
    Since $s_n$ is the supremum of the tail, we trivially know $x_n \leq s_n$.
    Combining these two inequalities, we get: $s_n - \frac{\varepsilon}{2} \leq x_n \leq s_n \implies 0 \leq s_n - x_n \leq \frac{\varepsilon}{2} \implies |s_n - x_n| \leq \frac{\varepsilon}{2}$
  7. We want to show that $|x_n - L| < \varepsilon$. Using the Triangle Inequality: $|x_n - L| \leq |x_n - s_n| + |s_n - L|$.
    We have previously established that $|x_n - s_n| \leq \frac{\varepsilon}{2}$ for $n \geq N$. Since $s_n \to L$, we can choose $N$ large enough such that $|s_n - L| < \frac{\varepsilon}{2}$.
  8. Combining these: $|x_n - L| < \frac{\varepsilon}{2} + \frac{\varepsilon}{2} = \varepsilon$. Therefore, $x_n \to L$. $\blacksquare$
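The tail-supremum construction $s_n = \sup\{x_k : k \ge n\}$ from the proof can be watched converging on a concrete sequence. This is an illustrative sketch: the example sequence $x_n = 1 + (-1)^n/n$ (limit $L = 1$) and the finite truncation standing in for the true supremum are my own choices.

```python
# The proof builds s_n = sup{x_k : k ≥ n} and shows s_n ↓ L.
# For x_n = 1 + (−1)^n / n, the limit is L = 1.
def x(n):
    return 1 + (-1) ** n / n

def s(n, horizon=100_000):
    # supremum over a long finite tail stands in for the true supremum
    return max(x(k) for k in range(n, n + horizon))

prev = float("inf")
for n in [1, 10, 100, 1000]:
    sn = s(n)
    print(n, sn)
    assert sn <= prev and sn >= 1    # monotone decreasing, squeezed toward L = 1
    prev = sn
```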

Theorem. The complex plane ℂ (with the standard metric) is complete.

Proof.

  1. Let $\{ z_n \}$ be a Cauchy sequence in ℂ. Write $z_n = x_n + iy_n$.
  2. We aim to show that $\{ x_n \}$ and $\{ y_n \}$ are Cauchy in ℝ.
    For m, n ≥ N with $|z_m - z_n| < ε: |x_m - x_n| = |\text{Re}(z_m - z_n)| \leq |z_m - z_n| < \varepsilon$ and $|y_m - y_n| = |\text{Im}(z_m - z_n)| \leq |z_m - z_n| < \varepsilon$
  3. Use completeness of ℝ. Since ℝ is complete, there exist x, y ∈ ℝ with $x_n \to x$ and $y_n \to y$.
  4. Goal: Show $z_n \to z = x + iy$.
    For ε > 0, choose N such that n ≥ N implies, $|x_n - x| < \varepsilon/\sqrt{2}$ and $|y_n - y| < \varepsilon/\sqrt{2}$.
    Then, $|z_n - z| = \sqrt{(x_n - x)^2 + (y_n - y)^2} < \sqrt{\frac{\varepsilon^2}{2} + \frac{\varepsilon^2}{2}} = \varepsilon$. Therefore $z_n \to z \in \mathbb{C}$. ∎
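The component-wise reduction in this proof can be sketched numerically. The example sequence below is my own illustration; the $\sqrt{2}$ factor is the same one used in Step 4.

```python
import math

# Component-wise reduction: for z_n = 1/n + i(1 + 1/n²) the parts give
# x_n → 0 and y_n → 1, hence z_n → i.
def z(n):
    return complex(1 / n, 1 + 1 / n ** 2)

L = 1j
for n in [10, 100, 1000]:
    xerr = abs(z(n).real - L.real)
    yerr = abs(z(n).imag - L.imag)
    # |z_n − L| = √(xerr² + yerr²) ≤ √2 · max(xerr, yerr)
    print(n, abs(z(n) - L), abs(z(n) - L) <= math.sqrt(2) * max(xerr, yerr))
```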

JustToThePoint Copyright © 2011 - 2026 Anawim. ALL RIGHTS RESERVED. Bilingual e-books, articles, and videos to help your child and your entire family succeed, develop a healthy lifestyle, and have a lot of fun. Social Issues, Join us.
