"You have enemies? Good. That means you've stood up for something, sometime in your life." (Winston Churchill)

The Product Rule for $\partial_x$ and $\partial_y$
Theorem. Suppose f, g: $\mathbb{C} \to \mathbb{C}$ are functions whose partial derivatives with respect to x and y exist. Then:
- $∂_x(f±g) = ∂_x(f) ± ∂_x(g), ∂_y(f±g) = ∂_y(f) ± ∂_y(g)$
- $∂_x(fg) = (∂_x(f))g + f(∂_x(g)), ∂_y(fg) = (∂_y(f))g + f(∂_y(g))$ (Product rule)
Proof (Product rule)
- Decomposition into Real/Imaginary Parts. Let f = u + iv, g = a + ib, where u, v, a, b: $\mathbb{C} \to \mathbb{R}$ are real-valued functions.
The product fg expands as follows: fg = (u + iv)(a + ib) = $\underbrace{ua - vb}_\text{Real part} + i\underbrace{(ub + va)}_\text{Imaginary part}$.
- Partial Derivative with Respect to x: $\frac{\partial}{\partial x}(fg) = \frac{\partial}{\partial x}(ua - vb) + i\frac{\partial}{\partial x}(ub + va)$
Apply the real-valued product rule to each term: $\frac{\partial}{\partial x}(ua - vb) = (\frac{\partial u}{\partial x}a + u\frac{\partial a}{\partial x}) -(\frac{\partial v}{\partial x}b + v\frac{\partial b}{\partial x})$
$\frac{\partial}{\partial x}(ub + va) = (\frac{\partial u}{\partial x}b + u\frac{\partial b}{\partial x}) + (\frac{\partial v}{\partial x}a + v\frac{\partial a}{\partial x})$
Combining these results: $\frac{\partial}{\partial x}(fg) = [(\frac{\partial u}{\partial x}a-\frac{\partial v}{\partial x}b)+i(\frac{\partial u}{\partial x}b + \frac{\partial v}{\partial x}a)]+[(u\frac{\partial a}{\partial x}-v\frac{\partial b}{\partial x})+i(u\frac{\partial b}{\partial x} + v\frac{\partial a}{\partial x})]$
- Factorization to Recover the Complex Product Rule: $\frac{\partial}{\partial x}(fg) = \underbrace{(\frac{\partial u}{\partial x} + i\frac{\partial v}{\partial x})}_{\frac{\partial f}{\partial x}}(a+ib) + (u+iv)\underbrace{(\frac{\partial a}{\partial x} + i\frac{\partial b}{\partial x})}_{\frac{\partial g}{\partial x}}$
Recognizing f = u + iv and g = a + ib, together with their partial derivatives, gives:
$\frac{\partial}{\partial x}(fg) = (\frac{\partial f}{\partial x})g + f(\frac{\partial g}{\partial x})$
- Case When a, b Are Independent of x. If $\frac{\partial a}{\partial x} = \frac{\partial b}{\partial x} = 0$ (e.g., a and b depend only on y), the expression simplifies to: $\frac{\partial}{\partial x}(fg) = (\frac{\partial u}{\partial x}a-\frac{\partial v}{\partial x}b)+i(\frac{\partial u}{\partial x}b + \frac{\partial v}{\partial x}a) = (\frac{\partial f}{\partial x})g$
- The proof for $\frac{\partial}{\partial y}(fg)$ follows identically by replacing x with y in the decomposition and differentiation steps. The structure remains unchanged because partial derivatives with respect to y operate analogously to those with respect to x.
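Before moving on, the theorem can be sanity-checked numerically. Below is a minimal sketch in Python, assuming the sample functions f(z) = z² and g(z) = eᶻ (chosen purely for illustration) and a central-difference helper d_dx introduced here:

```python
import cmath

def d_dx(F, x, y, h=1e-6):
    """Central-difference approximation of dF/dx at z = x + iy."""
    return (F(complex(x + h, y)) - F(complex(x - h, y))) / (2 * h)

f = lambda z: z**2          # sample f, for illustration only
g = lambda z: cmath.exp(z)  # sample g
fg = lambda z: f(z) * g(z)

x, y = 0.7, -0.3
z = complex(x, y)
lhs = d_dx(fg, x, y)
rhs = d_dx(f, x, y) * g(z) + f(z) * d_dx(g, x, y)
print(abs(lhs - rhs))  # tiny, up to finite-difference error, as the product rule predicts
```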
Exercises
- Sine Function Expansion. Using the angle sum formula: sin(A + B) = sin(A)cos(B) + cos(A)sin(B).
sin(z) = sin(x + iy) = sin(x)cos(iy) + cos(x)sin(iy). Since cos(iy) = cosh(y) and sin(iy) = i·sinh(y), this becomes
sin(z) = sin(x)cosh(y) + i·cos(x)sinh(y).
Therefore, $\boxed{\sin(z) = \sin(x)\cosh(y) + i\cos(x)\sinh(y)}$.
- Cosine Function Expansion. Using the angle sum formula: cos(A + B) = cos(A)cos(B) - sin(A)sin(B)
cos(z) = cos(x + iy) = cos(x)cos(iy) - sin(x)sin(iy). Using the same substitutions, cos(iy) = cosh(y) and sin(iy) = i·sinh(y), this becomes
cos(z) = cos(x)cosh(y) - i·sin(x)sinh(y).
Therefore, $\boxed{\cos(z) = \cos(x)\cosh(y) -i\sin(x)\sinh(y)}$.
- Partial Derivatives with Respect to x. Since cosh(y) and sinh(y) are independent of x, they are treated as constants when differentiating with respect to x.
For sin(z), $\frac{\partial}{\partial x}\sin(z) = \frac{\partial}{\partial x}(\sin(x)\cosh(y) + i\cos(x)\sinh(y)) = \cos(x)\cosh(y) -i\sin(x)\sinh(y) = \cos(z)$
For cos(z), $\frac{\partial}{\partial x}\cos(z) =\frac{\partial}{\partial x}(\cos(x)\cosh(y) -i\sin(x)\sinh(y)) = -\sin(x)\cosh(y)-i\cos(x)\sinh(y) = -\sin(z)$
- Application to z² = (x + iy)² = x² -y² +2xyi.
Substitute $u = x^2 -y^2, v = 2xy$ into sin(z²) and cos(z²).
sin(z²) = sin(x² -y²)cosh(2xy) + icos(x² -y²)sinh(2xy), cos(z²) = cos(x² -y²)cosh(2xy) -isin(x² -y²)sinh(2xy)
- Partial derivative of sin(z²) with respect to x.
Using the chain rule, $\frac{\partial}{\partial x}\sin(w) = \cos(w)\cdot \frac{\partial w}{\partial x}$ where w = z².
$\frac{\partial}{\partial x}\sin(z^2) = \cos(z^2)\cdot \frac{\partial}{\partial x}(z^2) = \cos(z^2)\cdot 2z$.
Expanding explicitly:
$\frac{\partial}{\partial x}\sin(z^2) = (2x + 2yi)\,[\cos(x^2 - y^2)\cosh(2xy) - i\sin(x^2 - y^2)\sinh(2xy)]$
$= 2x\cos(x^2 - y^2)\cosh(2xy) + 2y\sin(x^2 - y^2)\sinh(2xy) + i[2y\cos(x^2 - y^2)\cosh(2xy) - 2x\sin(x^2 - y^2)\sinh(2xy)]$
$= 2x[\cos(x^2 - y^2)\cosh(2xy) - i\sin(x^2 - y^2)\sinh(2xy)] + 2iy[\cos(x^2 - y^2)\cosh(2xy) - i\sin(x^2 - y^2)\sinh(2xy)]$
$= 2x\cos(z^2) + 2iy\cos(z^2) = 2(x + iy)\cos(z^2) = 2z\cos(z^2)$.
A numerical check of these results appears below.
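These computations are easy to sanity-check numerically. A minimal sketch using Python's cmath, with d_dx a central-difference helper introduced here for illustration (not a library function):

```python
import cmath, math

def d_dx(F, x, y, h=1e-6):
    """Central-difference approximation of dF/dx at z = x + iy."""
    return (F(complex(x + h, y)) - F(complex(x - h, y))) / (2 * h)

x, y = 0.7, -0.4
z = complex(x, y)

# The boxed expansions of sin(z) and cos(z).
print(abs(cmath.sin(z) - complex(math.sin(x) * math.cosh(y),
                                 math.cos(x) * math.sinh(y))))   # ~ 0
print(abs(cmath.cos(z) - complex(math.cos(x) * math.cosh(y),
                                 -math.sin(x) * math.sinh(y))))  # ~ 0

# d/dx sin(z) = cos(z) and d/dx cos(z) = -sin(z).
print(abs(d_dx(cmath.sin, x, y) - cmath.cos(z)))  # ~ 0 up to finite-difference error
print(abs(d_dx(cmath.cos, x, y) + cmath.sin(z)))  # ~ 0

# Chain rule: d/dx sin(z^2) = 2z cos(z^2).
print(abs(d_dx(lambda w: cmath.sin(w**2), x, y) - 2 * z * cmath.cos(z**2)))  # ~ 0
```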
Key insights:
- Complex trigonometric functions decompose into real trigonometric and hyperbolic functions.
- Partial derivatives with respect to x mirror real-variable calculus because y is held constant.
- The chain rule applies analogously for composite complex functions (e.g., $\sin(z^2)$).
- These results are foundational in complex analysis, linking trigonometric and hyperbolic functions via complex arguments.
Jacobians of Complex Functions
A complex function f: $\mathbb{C} \to \mathbb{C}$ can be written as f(z) = u(x, y) + iv(x, y), giving a map F: $\mathbb{R}^2 \to \mathbb{R}^2$: $F(x, y) = (u(x,y), v(x,y))$. Its Jacobian is the 2 × 2 matrix $J_F = \begin{pmatrix} u_x & u_y \\ v_x & v_y \end{pmatrix}$.
For f to be complex differentiable (holomorphic), the Cauchy–Riemann equations must hold: $u_x = v_y, u_y = -v_x$.
Under these conditions, the Jacobian takes the special form $J_F = \begin{pmatrix} a & -b \\ b & a \end{pmatrix}$ where $a = u_x = v_y$ and $b = v_x = -u_y$.
The matrix $\begin{pmatrix} a & -b \\ b & a \end{pmatrix}$ represents multiplication by the complex number w = a + ib.
Proof. Multiplication by w = a + ib acts on z = x + iy: $w \cdot z = (a + ib)(x + iy) = (ax - by) + i(bx + ay)$. (In the holomorphic case, w is exactly f′(z).)
As a map $\mathbb{R}^2 \to \mathbb{R}^2$: $(x, y) \mapsto (ax - by, bx + ay) = \begin{pmatrix} a & -b \\ b & a \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}$
The matrix $\begin{pmatrix} a & -b \\ b & a \end{pmatrix}$ can be written in polar form: $\begin{pmatrix} a & -b \\ b & a \end{pmatrix} = r \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$
where $r = \sqrt{a^2 + b^2} = |w|$ and $\theta = \arg(w)$, w = a + ib.
This is a rotation by θ followed by scaling by r: exactly what complex multiplication does!
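A small numerical illustration of this rotation-plus-scaling picture, using the arbitrary sample w = 3 + 4i:

```python
import numpy as np

a, b = 3.0, 4.0            # arbitrary sample w = a + ib
w = complex(a, b)
r, theta = abs(w), np.angle(w)

M = np.array([[a, -b], [b, a]])
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.allclose(M, r * R))  # True: M is a rotation by theta scaled by r = |w|
```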
Determinant of Cauchy–Riemann Matrices
$\det\begin{pmatrix} a & -b \\ b & a \end{pmatrix} = a^2 + b^2 = r^2 = |w|^2$
Interpretation: Complex-differentiable functions locally scale areas by |f′(z)|².
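The sketch below builds the Jacobian of the sample holomorphic map f(z) = z² by central differences and confirms both the special form and the area-scaling factor $\det J = |f'(z)|^2$:

```python
import numpy as np

def jacobian(F, x, y, h=1e-6):
    """Numerical Jacobian of F: R^2 -> R^2; columns are dF/dx and dF/dy."""
    Fx = (np.array(F(x + h, y)) - np.array(F(x - h, y))) / (2 * h)
    Fy = (np.array(F(x, y + h)) - np.array(F(x, y - h))) / (2 * h)
    return np.column_stack([Fx, Fy])

F = lambda x, y: (x**2 - y**2, 2 * x * y)  # (u, v) for f(z) = z^2
x, y = 1.0, 2.0
J = jacobian(F, x, y)
print(J)                          # approx [[2, -4], [4, 2]]: the form [[a, -b], [b, a]]
print(np.linalg.det(J))           # approx 20
print(abs(2 * complex(x, y))**2)  # |f'(z)|^2 = |2z|^2 = 20
```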
Extended Examples
- f(z) = z² = (x + iy)² = (x² - y²) + i(2xy). $J_f = [∂_xf | ∂_yf] = \begin{pmatrix} 2x & -2y \\ 2y & 2x \end{pmatrix}$.
This has the special form $\begin{pmatrix} a & -b \\ b & a \end{pmatrix}$!
The function is holomorphic everywhere, and f’(z) = 2z.
- A non‑holomorphic function. Let f(x, y) = (x² + y², 2xy). $J_f = [∂_xf | ∂_yf] = \begin{pmatrix} 2x & 2y \\ 2y & 2x \end{pmatrix}$
This matrix is symmetric and cannot be written as $\begin{pmatrix} a & -b \\ b & a \end{pmatrix}$ unless y = 0.
Hence, the function is not complex differentiable away from the real axis. Indeed, it fails the Cauchy–Riemann equations: $u_y = 2y$, $-v_x = -2y$, so they are equal only when y = 0.
- Maps to Higher‑Dimensional Spaces. Let f(x, y) = (x² + y², xy, x + y). The Jacobian is a 3 × 2 matrix whose columns are the partial derivatives with respect to x and y: $J_f = [∂_xf | ∂_yf] = \begin{pmatrix} 2x & 2y \\ y & x \\ 1 & 1 \end{pmatrix}$
- Change of Variables: Polar Coordinates.
Consider the transformation from polar coordinates (r, θ) to Cartesian coordinates (x, y): $F(r, \theta) = (r\cos\theta, r\sin\theta)$. The Jacobian is a 2 × 2 square matrix, $J_F = [∂_rF | ∂_θF] = \begin{pmatrix} \cos\theta & -r\sin\theta \\ \sin\theta & r\cos\theta \end{pmatrix}$
Its determinant is $\det J_F = r(\cos^2\theta + \sin^2\theta) = r$, which is the familiar area element $dx\,dy = r\,dr\,d\theta$ in double integrals.
This matrix has the special Cauchy–Riemann form only when r = 1, where it is a pure rotation.
- Let f(z) = eᶻ = eˣ(cos y + i sin y) where z = x + iy ∈ ℂ. Then $u = e^x \cos y, \quad v = e^x \sin y$, therefore $J_f = [∂_xf | ∂_yf] = \begin{pmatrix} e^x\cos y & -e^x\sin y \\ e^x\sin y & e^x\cos y \end{pmatrix}$
Again the special form $\begin{pmatrix} a & -b \\ b & a \end{pmatrix}$!
- f(z) = z̄ = x - iy. $u = x, \quad v = -y$.
$J_f = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}$.
This does NOT have the special form.
- Let f(z) = $3z + \bar z$ where z = x + iy, $\bar z = x - iy$. Then f = 3(x + iy) + (x - iy) = 4x + 2iy, i.e., f(x, y) = (4x, 2y), so $J_f = [∂_xf | ∂_yf] = \begin{pmatrix} 4 & 0 \\ 0 & 2 \end{pmatrix}$. This does NOT have the special form.
- f(z) = |z|² = x² + y² (real-valued). $u = x^2 + y^2, \quad v = 0$, $J_f = \begin{pmatrix} 2x & 2y \\ 0 & 0 \end{pmatrix}$. This has the special form only at the origin; indeed |z|² is complex differentiable only at z = 0. (A symbolic Cauchy–Riemann check of all these examples follows.)
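As promised, the Cauchy–Riemann equations can be tested symbolically across the examples. A minimal sketch with sympy; the (u, v) pairs mirror the examples above, and the residuals $u_x - v_y$ and $u_y + v_x$ vanish exactly when the equations hold:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)

examples = {
    'z^2':            (x**2 - y**2, 2 * x * y),
    '(x^2+y^2, 2xy)': (x**2 + y**2, 2 * x * y),
    'e^z':            (sp.exp(x) * sp.cos(y), sp.exp(x) * sp.sin(y)),
    'conj(z)':        (x, -y),
    '3z + conj(z)':   (4 * x, 2 * y),
    '|z|^2':          (x**2 + y**2, sp.Integer(0)),
}

for name, (u, v) in examples.items():
    cr1 = sp.simplify(sp.diff(u, x) - sp.diff(v, y))  # u_x - v_y
    cr2 = sp.simplify(sp.diff(u, y) + sp.diff(v, x))  # u_y + v_x
    print(f"{name}: residuals ({cr1}, {cr2})")
```

Only z² and eᶻ give (0, 0) everywhere; the others satisfy the equations only on the sets identified above (e.g., y = 0 for the symmetric example, z = 0 for |z|²).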
Dot Product as a Multilinear Function
Let f: ℝ³ × ℝ³ → ℝ be the dot product of two vectors in ℝ³ defined by: $f(\vec{x}, \vec{y}) = \vec{x} \cdot \vec{y} = x_1 y_1 + x_2 y_2 + x_3 y_3$
- Partial derivatives:
$\frac{\partial f}{\partial x_1} = y_1, \quad \frac{\partial f}{\partial x_2} = y_2, \quad \frac{\partial f}{\partial x_3} = y_3$
$\frac{\partial f}{\partial y_1} = x_1, \quad \frac{\partial f}{\partial y_2} = x_2, \quad \frac{\partial f}{\partial y_3} = x_3$
- Gradient: $\nabla f = (y_1, y_2, y_3, x_1, x_2, x_3)^T$
- Jacobian (1 × 6 row vector): $J_f = (y_1, y_2, y_3, x_1, x_2, x_3)$
- Differential: $\boxed{df_{(\vec{x},\vec{y})}(\Delta\vec{x}, \Delta\vec{y}) = \vec{y} \cdot \Delta\vec{x} + \vec{x} \cdot \Delta\vec{y}}$
The differential of a function at a point $(\vec{x},\vec{y})$, denoted by $df_{(\vec{x},\vec{y})}$, is the linear transformation that best approximates the change in f for small changes in its arguments.
Because our input is ($\vec{x}$, $\vec{y}$) ∈ ℝ³ × ℝ³, a small change around that point will be ($Δ\vec{x}$, $Δ\vec{y}$). The differential $df_{(\vec{x},\vec{y})}$ is a linear transformation in these increments $Δ\vec{x}$ and $Δ\vec{y}$.
So if we have small increments in $\vec{x}$ and $\vec{y}$, denoted by $Δ\vec{x}$ = (Δx₁, Δx₂, Δx₃) and $Δ\vec{y}$ = (Δy₁, Δy₂, Δy₃), then the approximate change in f is given by: $df_{(\vec{x},\vec{y})}(Δ\vec{x}, Δ\vec{y})$.
It can be found by taking the dot product of $∇f(\vec{x},\vec{y})$ with the increment vector: $df_{(\vec{x},\vec{y})}(Δ\vec{x}, Δ\vec{y}) = (y_1, y_2, y_3, x_1, x_2, x_3) \cdot (Δx_1, Δx_2, Δx_3, Δy_1, Δy_2, Δy_3)^T = y_1Δx_1 + y_2Δx_2 + y_3Δx_3 + x_1Δy_1 + x_2Δy_2 + x_3Δy_3 = (\vec{y}·Δ\vec{x}) + (\vec{x}·Δ\vec{y})$, where $(\vec{y}·Δ\vec{x})$ is the change contributed by $Δ\vec{x}$ while $\vec{y}$ is held fixed, and analogously $(\vec{x}·Δ\vec{y})$ is the change contributed by $Δ\vec{y}$ while $\vec{x}$ is held fixed.
$df_{(\vec{x},\vec{y})}(\Delta\vec{x}, \Delta\vec{y}) = \vec{y} \cdot \Delta\vec{x} + \vec{x} \cdot \Delta\vec{y}$. Thus, the change in the dot product consists of the sum of these two small changes, and this argument perfectly aligns with our intuition about how the dot product operates.
This is the product rule for the dot product: $d(\vec{x} \cdot \vec{y}) = d\vec{x} \cdot \vec{y} + \vec{x} \cdot d\vec{y}$
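A quick numerical sanity check with numpy: for small increments the differential matches the actual change in the dot product to first order, the leftover being the second-order term $Δ\vec{x} \cdot Δ\vec{y}$:

```python
import numpy as np

rng = np.random.default_rng(0)
x_vec, y_vec = rng.normal(size=3), rng.normal(size=3)
dx, dy = 1e-6 * rng.normal(size=3), 1e-6 * rng.normal(size=3)

actual = np.dot(x_vec + dx, y_vec + dy) - np.dot(x_vec, y_vec)
differential = np.dot(y_vec, dx) + np.dot(x_vec, dy)
print(abs(actual - differential))  # ~ 1e-12: just the second-order term dx . dy
```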