
Complex Differentiation and Jacobian Structure

"You have enemies? Good. That means you've stood up for something, sometime in your life," Winston Churchill.


The Product Rule for $\partial_x$ and $\partial_y$

Theorem. Suppose f, g: $\mathbb{C} \to \mathbb{C}$ have partial derivatives $∂_x$ and $∂_y$. Then:

  1. $∂_x(f±g) = ∂_x(f) ± ∂_x(g), ∂_y(f±g) = ∂_y(f) ± ∂_y(g)$
  2. $∂_x(fg) = (∂_x(f))g + f(∂_x(g)), ∂_y(fg) = (∂_y(f))g + f(∂_y(g))$ (Product rule)

Proof (Product rule)

  1. Decomposition into Real/Imaginary Parts. Let f = u + iv, g = a + ib, where u, v, a, b: $\mathbb{R}^2 \to \mathbb{R}$ are real-valued functions of (x, y).
    The product fg expands as follows: fg = (u + iv)(a + ib) = $\underbrace{ua -vb}_\text{Real part} + i\underbrace{ub +va}_\text{Imaginary part}$.
  2. Partial Derivative with Respect to x: $\frac{\partial}{\partial x}(fg) = \frac{\partial}{\partial x}(ua - vb) + i\frac{\partial}{\partial x}(ub + va)$
    Apply the real-valued product rule to each term: $\frac{\partial}{\partial x}(ua - vb) = (\frac{\partial u}{\partial x}a + u\frac{\partial a}{\partial x}) -(\frac{\partial v}{\partial x}b + v\frac{\partial b}{\partial x})$
    $\frac{\partial}{\partial x}(ub + va) = (\frac{\partial u}{\partial x}b + u\frac{\partial b}{\partial x}) + (\frac{\partial v}{\partial x}a + v\frac{\partial a}{\partial x})$
    Combining these results: $\frac{\partial}{\partial x}(fg) = [(\frac{\partial u}{\partial x}a-\frac{\partial v}{\partial x}b)+i(\frac{\partial u}{\partial x}b + \frac{\partial v}{\partial x}a)]+[(u\frac{\partial a}{\partial x}-v\frac{\partial b}{\partial x})+i(u\frac{\partial b}{\partial x} + v\frac{\partial a}{\partial x})]$
  3. Factorization to Recover the Complex Product Rule: $\frac{\partial}{\partial x}(fg) = \underbrace{(\frac{\partial u}{\partial x} + i\frac{\partial v}{\partial x})}_{\frac{\partial f}{\partial x}}(a+ib) + (u+iv)\underbrace{(\frac{\partial a}{\partial x} + i\frac{\partial b}{\partial x})}_{\frac{\partial g}{\partial x}}$
    Recognizing f = u + iv and g = a + ib, this is precisely:
    $\frac{\partial}{\partial x}(fg) = (\frac{\partial f}{\partial x})g + f(\frac{\partial g}{\partial x})$
  4. Case When a, b Are Independent of x. If $\frac{\partial a}{\partial x} = \frac{\partial b}{\partial x} = 0$ (e.g., a and b depend only on y), the expression simplifies to: $\frac{\partial}{\partial x}(fg) = (\frac{\partial u}{\partial x}a-\frac{\partial v}{\partial x}b)+i(\frac{\partial u}{\partial x}b + \frac{\partial v}{\partial x}a) = (\frac{\partial f}{\partial x})g$
  5. The proof for $\frac{\partial}{\partial y}(fg)$ follows identically by replacing x with y in the decomposition and differentiation steps. The structure remains unchanged because partial derivatives with respect to y operate analogously to those with respect to x.


Jacobians of Complex Functions

A complex function f: $\mathbb{C} \to \mathbb{C}$ can be written as f(z) = u(x, y) + iv(x, y), giving a map F: $\mathbb{R}^2 \to \mathbb{R}^2$: $F(x, y) = (u(x,y), v(x,y))$. Its Jacobian is the $2 \times 2$ matrix $J_F = \begin{pmatrix} u_x & u_y \\ v_x & v_y \end{pmatrix}$.

For f to be complex differentiable (holomorphic), the Cauchy–Riemann equations must hold: $u_x = v_y, u_y = -v_x$.

Under these conditions, the Jacobian takes the special form $J_F = \begin{pmatrix} a & -b \\ b & a \end{pmatrix}$ where $a = u_x = v_y$ and $b = v_x = -u_y$.

The matrix $\begin{pmatrix} a & -b \\ b & a \end{pmatrix}$ represents multiplication by the complex number w = a + ib.

Proof. Multiplication by w = a + ib acts on z = x + iy: $w \cdot z = (a + ib)(x + iy) = (ax - by) + i(bx + ay)$. For a holomorphic f, the Jacobian acts as multiplication by $w = f'(z) = u_x + iv_x$.

As a map $\mathbb{R}^2 \to \mathbb{R}^2$: $(x, y) \mapsto (ax - by, bx + ay) = \begin{pmatrix} a & -b \\ b & a \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}$

The matrix $\begin{pmatrix} a & -b \\ b & a \end{pmatrix}$ can be written in polar form: $\begin{pmatrix} a & -b \\ b & a \end{pmatrix} = r \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$
where $r = \sqrt{a^2 + b^2} = |w|$ and $\theta = \arg(w)$, w = a + ib.

This is a rotation by θ followed by scaling by r: exactly what complex multiplication does!

Determinant of Cauchy–Riemann Matrices

$\det\begin{pmatrix} a & -b \\ b & a \end{pmatrix} = a^2 + b^2 = r^2 = |w|^2$

Interpretation: complex-differentiable functions scale areas locally by $|f'(z)|^2$.

Extended Examples

Dot Product as a Multilinear Function

Let f: ℝ³ × ℝ³ → ℝ be the dot product of two vectors in ℝ³ defined by: $f(\vec{x}, \vec{y}) = \vec{x} \cdot \vec{y} = x_1 y_1 + x_2 y_2 + x_3 y_3$

  1. Partial derivatives:
    $\frac{\partial f}{\partial x_1} = y_1, \quad \frac{\partial f}{\partial x_2} = y_2, \quad \frac{\partial f}{\partial x_3} = y_3$
    $\frac{\partial f}{\partial y_1} = x_1, \quad \frac{\partial f}{\partial y_2} = x_2, \quad \frac{\partial f}{\partial y_3} = x_3$
  2. Gradient: $\nabla f = (y_1, y_2, y_3, x_1, x_2, x_3)^T$
  3. Jacobian (1 × 6 row vector): $J_f = (y_1, y_2, y_3, x_1, x_2, x_3)$
  4. Differential: $\boxed{df_{(\vec{x},\vec{y})}(\Delta\vec{x}, \Delta\vec{y}) = \vec{y} \cdot \Delta\vec{x} + \vec{x} \cdot \Delta\vec{y}}$
    The differential of a function at a point $(\vec{x},\vec{y})$, denoted by $df_{(\vec{x},\vec{y})}$ is a linear transformation that best approximates the change in f for small changes in its arguments.
    Because our input is ($\vec{x}$, $\vec{y}$) ∈ ℝ³ × ℝ³, a small change around that point will be ($Δ\vec{x}$, $Δ\vec{y}$). The differential $df_{(\vec{x},\vec{y})}$ is a linear transformation in these increments $Δ\vec{x}$ and $Δ\vec{y}$.

So if we have small increments in $\vec{x}$ and $\vec{y}$, denoted by $Δ\vec{x}$ = (Δx₁, Δx₂, Δx₃) and $Δ\vec{y}$ = (Δy₁, Δy₂, Δy₃), then the approximate change in f is given by: $df_{(\vec{x},\vec{y})}(Δ\vec{x}, Δ\vec{y})$.

It can be found by taking the dot product of $∇f(\vec{x},\vec{y})$ with the increment vector: $df_{(\vec{x},\vec{y})}(Δ\vec{x}, Δ\vec{y}) = (y_1, y_2, y_3, x_1, x_2, x_3) \cdot (Δx_1, Δx_2, Δx_3, Δy_1, Δy_2, Δy_3) = y_1Δx_1 + y_2Δx_2 + y_3Δx_3 + x_1Δy_1 + x_2Δy_2 + x_3Δy_3 = (\vec{y}·Δ\vec{x}) + (\vec{x}·Δ\vec{y})$, where $(\vec{y}·Δ\vec{x})$ is the change contributed by $Δ\vec{x}$ while $\vec{y}$ is held fixed, and analogously $(\vec{x}·Δ\vec{y})$ is the change contributed by $Δ\vec{y}$ while $\vec{x}$ is held fixed.

$df_{(\vec{x},\vec{y})}(\Delta\vec{x}, \Delta\vec{y}) = \vec{y} \cdot \Delta\vec{x} + \vec{x} \cdot \Delta\vec{y}$. Thus, to first order, the change in the dot product is the sum of these two contributions, which matches our intuition about how the dot product operates.

This is the product rule for the dot product: $d(\vec{x} \cdot \vec{y}) = d\vec{x} \cdot \vec{y} + \vec{x} \cdot d\vec{y}$
