You cannot achieve the impossible without attempting the absurd.
Assumption is the mother of all screw-ups.
A function of two variables $f: ℝ × ℝ → ℝ$ assigns to each ordered pair in its domain a unique real number, e.g., Area = $\frac{1}{2}b·h$, $z = f(x, y) = 2x + 3y$, $f(x, y) = x^2 + y^2$, $e^{x+y}$, etc.
Partial derivatives are derivatives of a function of multiple variables, say $f(x_1, x_2, \dots, x_n)$, with respect to one of those variables, with the other variables held constant. They measure how the function changes as one variable changes, while keeping the others constant. Examples: $f(x, y) = 2x^2y^3$, $\frac{\partial f}{\partial x} = f_x = 4xy^3, \frac{\partial f}{\partial y} = f_y = 6x^2y^2$; $f(x, y) = x^3y + y^2$, $\frac{\partial f}{\partial x} = f_x = 3x^2y, \frac{\partial f}{\partial y} = f_y = x^3+2y.$
Let $f(x_1, x_2, \dots, x_n)$ be a function of n variables. A point $(x_1, x_2, \dots, x_n)$ in the domain of f is considered a critical point if the gradient of f at that point is the zero vector or if the gradient does not exist. The gradient ∇f is a vector of partial derivatives of f: ∇f = ($\frac{∂f}{∂x_1}, \frac{∂f}{∂x_2}, \dots, \frac{∂f}{∂x_n}$).
In particular, for n = 2, this means that at a critical point $(x_0, y_0)$, $\frac{\partial f}{\partial x}(x_0, y_0) = 0$ and $\frac{\partial f}{\partial y}(x_0, y_0) = 0$.
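As a quick numerical sketch of this definition (an illustration, not part of the original text; the helper name `gradient` and the step size are my own choices), the gradient can be approximated with central differences to check whether a point is critical:

```python
# Illustrative sketch: central-difference approximation of the gradient.
def gradient(f, x, y, h=1e-6):
    """Approximate (df/dx, df/dy) at (x, y) by central differences."""
    fx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    fy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return fx, fy

f = lambda x, y: x**2 + y**2              # example function from the text
fx, fy = gradient(f, 0.0, 0.0)
print(abs(fx) < 1e-8 and abs(fy) < 1e-8)  # True: (0, 0) is a critical point
```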
When we analyze a quadratic function of two variables, $w = ax^2 + bxy + cy^2$, we’re looking at how the function behaves around certain points, especially the origin (0, 0). The origin is a critical point where the partial derivatives with respect to x and y are zero.
Consider $w = x^2 + 2xy + 3y^2$ =[We can rewrite this function as:] $(x+y)^2 + 2y^2$.
This is the sum of two squares, which means the value of w is always non-negative. Since w = 0 at the origin (0, 0), the origin is a minimum.
For a general quadratic function of two variables $w = ax^2 + bxy + cy^2$ (assuming a ≠ 0, indicating a true second-degree equation), we can rewrite the function to understand the nature of the critical points better: w = $a(x^2+\frac{b}{a}xy) + cy^2$ =[Completing the square, i.e., consider that $a(x+\frac{b}{2a}y)^2 = ax^2 + 2a\frac{b}{2a}xy + a\frac{b^2}{4a^2}y^2 = ax^2 + bxy + \frac{b^2}{4a}y^2$] $a(x+\frac{b}{2a}y)^2+(c-\frac{b^2}{4a})y^2$ =[\text{Factoring and simplifying}] $\frac{1}{4a}[4a^2(x+\frac{b}{2a}y)^2+(4ac-b^2)y^2]$
This reformulation allows us to examine the nature of the critical points depending on the sign of the discriminant $4ac - b^2$:
If $4ac - b^2 < 0$ ⇒ $4a^2(x+\frac{b}{2a}y)^2 ≥ 0, (4ac-b^2)y^2 ≤ 0$ ⇒ The positive and negative contributions indicate a saddle point at the origin, where the function takes on both positive and negative values around the critical point.
If $4ac - b^2 = 0$ ⇒ w = $a(x+\frac{b}{2a}y)^2$. For example, $w = x^2$ represents a flat valley along the y-axis (Figure D). The critical points lie along the y-axis, making the origin a degenerate critical point.
If $4ac - b^2 > 0$ ⇒ $4a^2(x+\frac{b}{2a}y)^2+(4ac-b^2)y^2 > 0$ for (x, y) ≠ (0, 0), so w is $\frac{1}{4a}$ times a sum of two squares, and there are two cases: if a > 0, then w > 0 away from the origin and the origin is a minimum; if a < 0, then w < 0 away from the origin and the origin is a maximum.
We can also express the function in another way to make it more intuitive: w = $ax^2 + bxy + cy^2 = y^2[a(\frac{x}{y})^2+b(\frac{x}{y})+c]$ (for y ≠ 0), where $y^2 ≥ 0$, so the sign of w is governed by the bracketed quadratic.
We can reformulate $a(\frac{x}{y})^2+b(\frac{x}{y})+c$ as $at^2 + bt + c$ where $t = \frac{x}{y}$.
A critical point of a function f of one variable is an x-value within the domain D of f where the derivative f’(x) is either zero or undefined. Notice that the sign of f’ must stay constant between two consecutive critical points.
If the derivative of a function changes sign around a critical point, the function is said to have a local or relative extremum (maximum or minimum) at that point. If f’ changes sign from positive (increasing function) to negative (decreasing function), the function has a local or relative maximum at that critical point. Similarly, if f’ changes sign from negative to positive, the function has a local or relative minimum.
In multivariable calculus, the second derivative test helps us determine the nature of critical points (whether they are local maxima, minima, or saddle points). This test involves calculating the second partial derivatives and using them to analyze the behavior of the function near the critical points.
We calculate the second partial derivatives of the function with respect to both variables x and y, i.e., $\frac{\partial^2 f}{\partial x^2} = f_{xx} = A, \frac{\partial^2 f}{\partial y^2} = f_{yy} = C, f_{xy} = \frac{\partial^2 f}{\partial x \partial y}= \frac{\partial^2 f}{\partial y \partial x} = f_{yx} = B$.
The Second Derivative Test. Suppose that (x0, y0) is a critical point of f(x, y) (i.e., fx(x0, y0) = 0, fy(x0, y0) = 0) and that the second-order partial derivatives of f are continuous in some region that contains (x0, y0). Let D = $f_{xx}(x_0, y_0)f_{yy}(x_0, y_0) - [f_{xy}(x_0, y_0)]^2$.
Find the critical points of the function f(x, y) by solving fx(x, y) = 0 and fy(x, y) = 0 where fx and fy are the first partial derivatives of f with respect to x and y, respectively.
Calculate the second partial derivatives. Let A = $f_{xx}(x_0,y_0)= \frac{\partial^2 f}{\partial x^2}(x_0, y_0); B = f_{xy}(x_0, y_0) = \frac{\partial^2 f}{\partial x\partial y}(x_0, y_0) = \frac{\partial^2 f}{\partial y\partial x}(x_0, y_0); C = f_{yy}(x_0, y_0) = \frac{\partial^2 f}{\partial y^2}(x_0, y_0).$
Compute the Hessian matrix H at each critical point, H = $(\begin{smallmatrix}\frac{\partial^2 f}{\partial x^2} & \frac{\partial^2 f}{\partial x \partial y}\\ \frac{\partial^2 f}{\partial y \partial x} & \frac{\partial^2 f}{\partial y^2}\end{smallmatrix})$.
Calculate the determinant of the Hessian matrix: D = $f_{xx}·f_{yy} - (f_{xy})^2 = AC - B^2$.
Suppose f: $ℝ^n → ℝ$ is a function taking as input a vector x ∈ $ℝ^n$ and outputting a scalar f(x) ∈ ℝ. If all second-order partial derivatives of f exist, then the Hessian matrix $\text{H}$ of f is a square n × n matrix, defined and arranged as $(\text{H}_f)_{i,j} = \frac{∂^2f}{∂x_i∂x_j}$.
Analyze the sign of D and $f_{xx}$ at each critical point to determine its nature: if D > 0 and $f_{xx}$ > 0, the critical point is a local minimum; if D > 0 and $f_{xx}$ < 0, it is a local maximum; if D < 0, it is a saddle point; and if D = 0, the test is inconclusive.
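This case analysis can be sketched as a small helper (a hedged illustration; the function name `classify` is my own):

```python
# Hedged sketch of the second derivative test's case analysis.
def classify(A, B, C):
    """A = f_xx, B = f_xy, C = f_yy at a critical point; D = AC - B^2."""
    D = A * C - B * B
    if D > 0:
        return "local minimum" if A > 0 else "local maximum"
    if D < 0:
        return "saddle point"
    return "inconclusive"

print(classify(2, 0, 2))   # local minimum (e.g., f = x^2 + y^2 at the origin)
print(classify(0, -3, 0))  # saddle point
```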
Consider our previous example, a function $f(x, y) = ax^2 + bxy + cy^2$.
Notice that the previous results verify this theorem, and our last case corresponds to degenerate critical points.
The second derivative test relies on approximating the function near the critical point (x0, y0) using a quadratic (second-order) polynomial, the best quadratic approximation at the critical point.
Assume that f(x,y) is a function with continuous second derivatives in the neighborhood of a critical point (x0, y0). This approximation, called the Taylor series expansion, is given by:
$f(x, y) ≈ f(x_0, y_0) + f_x (x -x_0) + f_y(y - y_0) + \frac{1}{2}f_{xx}(x-x_0)^2+ f_{xy}(x-x_0)(y-y_0) + \frac{1}{2}f_{yy}(y-y_0)^2$
At a critical point, the first partial derivatives fx and fy are zero, so the linear terms in the Taylor series vanish, simplifying the approximation to:
$f(x, y) ≈ f(x_0, y_0) + \frac{1}{2}f_{xx}(x-x_0)^2+ f_{xy}(x-x_0)(y-y_0) + \frac{1}{2}f_{yy}(y-y_0)^2$
For simplicity, let’s rewrite the quadratic approximation as:
$f(x, y) ≈ f(x_0, y_0) + \frac{1}{2}f_{xx}(x-x_0)^2+ f_{xy}(x-x_0)(y-y_0) + \frac{1}{2}f_{yy}(y-y_0)^2$, hence the general case reduces to the quadratic case where $\frac{1}{2}f_{xx} = \frac{1}{2}A, f_{xy} = B, \frac{1}{2}f_{yy} = \frac{1}{2}C$
$f(x, y) ≈ f(x_0, y_0) + \frac{1}{2}(A(x-x_0)^2+2B(x-x_0)(y-y_0)+ C(y-y_0)^2)$.
To further simplify the notation, let u = x -x0, v = y -y0.
This shifts the critical point (x0, y0) to the origin, transforming the quadratic approximation to:
$w(u, v) ≈ w_0 + \frac{1}{2}(Au^2 + 2Buv + Cv^2)$, where the coefficients A, B, and C are given by the second partial derivatives with respect to u and v at (0, 0), or, equivalently, by the second partial derivatives with respect to x and y at (x0, y0).
Since the quadratic function on the right is the best approximation of w and f, it is reasonable to suppose that their graphs are essentially the same near (0, 0) and (x0, y0) respectively, and so if the quadratic function has a maximum, minimum, or saddle point there, so will f(x, y). Thus, our results for the special case of a quadratic function carry over to the general function f(x, y) at a critical point (x0,y0).
The reason this argument fails in the degenerate case is that the quadratic approximation is only decisive when the higher-order terms (derivatives) are insignificant; in the degenerate case, they can dominate.
Consider f(x, y) = $x^2 + y^2$. First, we find the partial derivatives of f with respect to x and y: $\frac{\partial f}{\partial x} = 2x, \frac{\partial f}{\partial y} = 2y.$
To find the critical points, we set these derivatives equal to zero: 2x = 0 ⇒ x = 0, 2y = 0 ⇒ y = 0.
Therefore, the only critical point is (0, 0). Since $f(x, y) = x^2 + y^2 ≥ 0$ for all x and y, and f(0, 0) = 0, this point is not just a local minimum, but a global minimum.
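A small sanity check (an illustrative sketch, not part of the original argument): sampling f(x, y) = x² + y² on a grid confirms that the origin gives the smallest value.

```python
# Grid-sampling check that f(0, 0) = 0 is the smallest sampled value.
f = lambda x, y: x**2 + y**2
values = [f(i * 0.1, j * 0.1) for i in range(-20, 21) for j in range(-20, 21)]
print(min(values) == f(0.0, 0.0) == 0.0)  # True: the origin minimizes every sample
```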
Consider f(x, y) = $x^3 - 3xy^2$. First, we find the partial derivatives of f with respect to x and y: $\frac{\partial f}{\partial x} = 3x^2-3y^2, \frac{\partial f}{\partial y} = -6xy.$
To find the critical points, we set these derivatives equal to zero: $3x^2-3y^2 = 0 ⇒ x^2 = y^2 ⇒ x=±y, -6xy = 0 ⇒ x = 0~\text{or}~y = 0$
Combining these equations, we get that the only critical point is (0, 0).
Evaluate the second partial derivatives and use the second derivative test:
$f_{xx} = 6x, f_{yy} = -6x, f_{xy} = -6y$
We calculate the determinant of the Hessian matrix H: D = $f_{xx}(0, 0)f_{yy}(0, 0) - [f_{xy}(0, 0)]^2 = (6·0)(-6·0) - (-6·0)^2 = 0$ ⇒ Since D = 0, the second derivative test is inconclusive. This means that we cannot determine the nature of the critical point (0, 0) using the second derivative test alone.
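Even though the test is inconclusive here, sampling f(x, y) = x³ − 3xy² (a function consistent with the stated partial derivatives, up to an additive constant) near the origin shows it takes both signs, so the origin is in fact a saddle (the so-called monkey saddle):

```python
# f reconstructed from the stated partials: f_x = 3x^2 - 3y^2, f_y = -6xy.
f = lambda x, y: x**3 - 3 * x * y**2
print(f(0.1, 0.0) > 0, f(-0.1, 0.0) < 0)  # True True: f changes sign near (0, 0)
```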
Consider f(x, y) = $x^2 - 2xy + 3y^2 + 2x - 2y$. First, we find the partial derivatives of f with respect to x and y: $\frac{\partial f}{\partial x} = 2x -2y +2, \frac{\partial f}{\partial y} = -2x +6y -2$
To find the critical points, we set these partial derivatives equal to zero: $\begin{cases} 2x -2y +2 = 0 \\ -2x +6y -2 = 0 \end{cases}$
We now solve this system of equations to find the critical points.
First, let’s add the two equations together to eliminate x: (2x−2y+2)+(−2x+6y−2)=0 ⇒[Simplifying this, we get:] 4y = 0 ⇒y = 0.
Now, substitute y = 0 into the first equation, 2x -2y +2 = 0: 2x -2·0 + 2 = 0 ⇒ 2x + 2 = 0 ⇒ x = -1. Therefore, the critical point is (-1, 0)
To determine whether (−1, 0) is a local minimum, maximum, or saddle point, we complete the square for the function f(x, y).
f(x, y) = $x^2 - 2xy + 3y^2 + 2x - 2y$ =[Let’s rewrite f(x, y) by completing the square] $(x-y)^2 + 2y^2 + 2x - 2y$ =[Complete the square again] $((x-y) + 1)^2 + 2y^2 - 1$ ≥ [Since obviously $((x-y) + 1)^2 ≥ 0, 2y^2 ≥ 0$] $-1 = f(-1, 0)$, hence (-1, 0) is a minimum.
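As a hedged numerical check of the completing-the-square argument (the grid and step size are my own choices), sampling f around (−1, 0) finds no value below −1:

```python
# Grid-sampling check that f(-1, 0) = -1 is the smallest sampled value.
f = lambda x, y: x**2 - 2*x*y + 3*y**2 + 2*x - 2*y
samples = [f(-1 + i * 0.05, j * 0.05) for i in range(-10, 11) for j in range(-10, 11)]
print(min(samples) == f(-1, 0) == -1)  # True: no sampled value drops below f(-1, 0)
```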
Find the first partial derivatives fx and fy: fx = -4x + 2y + 12, fy = 2x -2y -4, fxx = -4, fyy = -2, fxy = 2.
Solve the system of equations fx = 0 and fy = 0 to find critical points:
$\begin{cases} -4x + 2y + 12 = 0 (i)\\ 2x -2y -4 = 0 (ii)\end{cases}$
Adding these two equations (i) + (ii): -2x + 8 = 0⇒x = $\frac{-8}{-2} = 4$.
Substitute x = 4 into (i): $-4·4 + 2y + 12 = 0 ↭ 2y - 4 = 0 ↭ y = \frac{4}{2} = 2.$ Therefore, the critical point is (4, 2).
Evaluate the second partial derivatives and use the second derivative test:
The second derivative test involves computing the determinant D of the Hessian matrix H: D = $f_{xx}(x_0, y_0)f_{yy}(x_0, y_0) - [f_{xy}(x_0, y_0)]^2 = (-4)·(-2) - 2^2 = 8 - 4 = 4$.
Since D > 0 and fxx(x0, y0) = -4 < 0, then f(4, 2) = 13 is a relative (local) maximum.
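The arithmetic of the test at (4, 2) can be sketched directly (values taken from the partials above):

```python
# Second derivative test at the critical point (4, 2).
A, B, C = -4, 2, -2        # f_xx, f_xy, f_yy from the text
D = A * C - B**2
print(D, D > 0 and A < 0)  # 4 True -> local maximum
```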
Find the first partial derivatives fx and fy: $f_x = 3x^2 - 3y, f_y = 3y^2 - 3x, f_{xx} = 6x, f_{yy} = 6y, f_{xy} = -3$.
Solve the system of equations fx = 0 and fy = 0 to find critical points:
$\begin{cases} 3x^2-3y = 0 (i)\\ 3y^2-3x = 0 (ii)\end{cases}$
From (i): $3x^2 - 3y = 0 ⇒ y = x^2$.
Substitute y = x² into the second equation (ii): $3(x^2)^2 - 3x = 3x^4 - 3x = 3x(x^3-1) = 0 ⇒ x = 0$ or $x = 1$; with $y = x^2$, x = 0 ⇒ y = 0 and x = 1 ⇒ y = 1. So, the critical points are (0, 0) and (1, 1).
Evaluate the second partial derivatives and use the second derivative test:
The second derivative test involves computing the determinant D of the Hessian matrix H: D = $f_{xx}(x_0, y_0)f_{yy}(x_0, y_0) - [f_{xy}(x_0, y_0)]^2 = (6x)(6y) - (-3)^2 = 36xy - 9$.
At the critical point (0, 0): D(0, 0) = 36·0·0 -9 = -9 < 0⇒(0, 0) is a saddle point.
At the critical point (1, 1): D(1, 1) = 36·1·1 -9 = 27 > 0 and fxx(x0, y0) = fxx(1, 1) = 6 > 0 ⇒(1, 1) is a relative (local) minimum with f(1, 1) = 3.
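The discriminant D = 36xy − 9 at both critical points can be sketched in a couple of lines (the helper name `D` is my own):

```python
# Discriminant from the text: D = f_xx * f_yy - f_xy^2 = 36xy - 9.
D = lambda x, y: (6 * x) * (6 * y) - (-3) ** 2
print(D(0, 0), D(1, 1))  # -9 27: saddle at (0, 0), minimum at (1, 1) since f_xx = 6 > 0
```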
To find the critical points, we first need to compute the partial derivatives of f with respect to x and y: $f_x = 2y -\frac{32}{x^2}, f_y = 2x -\frac{64}{y^2}$.
To find the critical points, we set these partial derivatives equal to zero and solve for x and y:
$2y -\frac{32}{x^2} = 0, 2x -\frac{64}{y^2} = 0$
From the first equation: $2y = \frac{32}{x^2} ⇒ y = \frac{16}{x^2}$.
Substitute $y = \frac{16}{x^2}$ into the second equation: $2x = \frac{64}{\frac{16^2}{x^4}} = \frac{64}{\frac{256}{x^4}} = \frac{64x^4}{256} = \frac{x^4}{4}.$
Simplify and solve for x: $2x = \frac{x^4}{4} ⇒ 8x = x^4 ⇒ x(x^3-8) = 0$. This gives us x = 0 or x3 = 8 (x = 2).
Since x = 0 is not valid (division by zero), we have x = 2. Substitute x = 2 back into $y = \frac{16}{x^2}, y = \frac{16}{4} = 4$. Thus, the only critical point is (2, 4).
Let’s verify that (2, 4) is indeed a critical point by plugging it back into the partial derivatives: $f_x(2, 4) = 8-\frac{32}{4} = 8-8 = 0; f_y(2, 4) =4-\frac{64}{16} = 4 -4 = 0.$
Now, we compute the second partial derivatives to use the second derivative test: $f_{xx} = \frac{64}{x^3}, f_{xy} = f_{yx} =2, f_{yy} = \frac{128}{y^3}$.
Then, evaluating these at (2, 4), we get $f_{xx} = \frac{64}{x^3}|_{(2, 4)} = 8$.
$f_{xy} = f_{yx} =2.$
$f_{yy} = \frac{128}{y^3}|_{(2, 4)} =2$ ⇒
The second derivative test involves computing the determinant D of the Hessian matrix H: D = AC - B² = det$(\begin{smallmatrix}8 & 2\\ 2 & 2\end{smallmatrix})$ = $8·2 - 2^2 = 12 > 0$ and $f_{xx} > 0$ ⇒ (2, 4) is a local minimum.
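The source does not restate f here, but f(x, y) = 2xy + 32/x + 64/y is consistent with the stated partial derivatives (up to a constant); a central-difference check (step size my own choice) confirms that (2, 4) is critical:

```python
# f reconstructed from the stated partials: f_x = 2y - 32/x^2, f_y = 2x - 64/y^2.
f = lambda x, y: 2 * x * y + 32 / x + 64 / y
h = 1e-6
fx = (f(2 + h, 4) - f(2 - h, 4)) / (2 * h)
fy = (f(2, 4 + h) - f(2, 4 - h)) / (2 * h)
print(abs(fx) < 1e-6 and abs(fy) < 1e-6)  # True: both partials vanish at (2, 4)
```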
To find the critical points, we first compute the partial derivatives: $f_x = 3x^2 -3 = 3(x^2-1) = 3(x-1)(x+1), f_y = 3y^2-6y = 3(y^2-2y)=3·y·(y-2).$
Set the partial derivatives equal to zero: $3(x-1)(x+1) = 0 ↭ x = ±1$; $3(y^2 - 2y) = 3y(y-2) = 0 ↭ y = 0$ or $y = 2$. Thus, there are four critical points, namely (1, 0), (1, 2), (-1, 0), and (-1, 2).
Now, compute the second partial derivatives, $f_{xx} = 6x, f_{xy} = 0, f_{yy} = 6y -6$.
The second derivative test involves computing the determinant D of the Hessian matrix H: D = AC -B2 = (6x)(6y-6)-02 = 36xy -36x.
$D(1, 2) = 36xy - 36x|_{(1, 2)} = 36·1·2 - 36·1 = 36 > 0$, A = 6·1 > 0 ⇒ (1, 2) is a local minimum.
$D(-1, 2) = 36xy - 36x|_{(-1, 2)} = 36·(-1)·2 - 36·(-1) = -36 < 0$ ⇒ (-1, 2) is a saddle point.
$D(1, 0) = 36xy - 36x|_{(1, 0)} = 36·1·0 - 36·1 = -36 < 0$ ⇒ (1, 0) is a saddle point.
$D(-1, 0) = 36xy - 36x|_{(-1, 0)} = 36·(-1)·0 - 36·(-1) = 36 > 0$, A = 6·(-1) < 0 ⇒ (-1, 0) is a local maximum.
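The sign pattern of D = 36xy − 36x at the four critical points can be tabulated in a few lines (a sketch; the loop layout is my own):

```python
# Discriminant from the text: D = AC - B^2 = (6x)(6y - 6) - 0 = 36xy - 36x.
D = lambda x, y: 36 * x * y - 36 * x
for p in [(1, 0), (1, 2), (-1, 0), (-1, 2)]:
    print(p, D(*p))  # (1, 0) -36, (1, 2) 36, (-1, 0) 36, (-1, 2) -36
```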
To find the critical points, we first compute the partial derivatives: $f_x =1-\frac{1}{x^2y}, f_y = 1-\frac{1}{xy^2}$.
Set the partial derivatives equal to zero: $1-\frac{1}{x^2y} = 0, 1-\frac{1}{xy^2} = 0$.
$1-\frac{1}{x^2y} = 0 ⇒ \frac{1}{x^2y} = 1 ⇒ x^2y = 1$ (i)
$1-\frac{1}{xy^2} = 0 ⇒ \frac{1}{xy^2} = 1 ⇒ xy^2 = 1$ (ii)
Divide (i) by (ii): $\frac{x^2y}{xy^2} = \frac{x}{y} = 1 ⇒ x = y$.
Substitute x = y into (ii): $x^3 = 1$ ⇒[By assumption, x, y > 0] x = 1 ⇒[x = y] y = 1. Thus, there is only one critical point, (1, 1).
Let’s compute the second partial derivatives, $f_{xx}=\frac{2}{x^3y}, f_{xy}=\frac{1}{x^2y^2}, f_{yy}=\frac{2}{xy^3}$.
Evaluate the Second Partial Derivatives at the Critical Point (1, 1): A = 2, B = 1, and C = 2.
The second derivative test involves computing the determinant D of the Hessian matrix H: D = $AC - B^2 = 2·2 - 1^2 = 3 > 0$, and A = fxx > 0.
Since D > 0 and fxx > 0, the critical point (1, 1) is a local minimum.
Consider the Behavior at Infinity:
$\lim_{x \to ∞} x + y + \frac{1}{xy} = \lim_{y \to ∞} x + y + \frac{1}{xy} = ∞.$
Similarly, consider the limits as x → 0 or y → 0:
$\lim_{x \to 0} x + y + \frac{1}{xy} = \lim_{y \to 0} x + y + \frac{1}{xy} = ∞$.
The function increases without bound (no maximum is attained) as x or y approaches infinity or zero. These boundary behaviors are not detected by critical points, so in general we need to check both the critical points and the boundaries. Here, since f → ∞ in every boundary direction, the local minimum at (1, 1), with f(1, 1) = 3, is in fact a global minimum.
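A hedged numerical illustration of this boundary behavior for f(x, y) = x + y + 1/(xy) (the sample points are my own choices):

```python
# f from the text; its value grows near the boundary (x or y -> 0) and at infinity.
f = lambda x, y: x + y + 1 / (x * y)
print(f(1, 1))                           # 3.0, the value at the critical point
print(f(0.01, 1) > f(1, 1) < f(100, 1))  # True: f exceeds 3 toward the boundary and infinity
```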