For every problem there is always, at least, a solution which seems quite plausible. It is simple and clean, direct, neat and nice, and yet very wrong, #Anawim, justtothepoint.com
There are two ways to do great mathematics. The first is to be smarter than everybody else. The second way is to be stupider than everybody else — but persistent, Raoul Bott
An algebraic equation is a mathematical statement that declares or asserts the equality of two algebraic expressions. These expressions are constructed using variables, constants, and the usual algebraic operations: addition, subtraction, multiplication, division, and exponentiation.
Definition. A differential equation is an equation that involves one or more dependent variables, their derivatives with respect to one or more independent variables, and the independent variables themselves, e.g., $\frac{dy}{dx} = 3x + 5y$, $y' + y = 4x\cos(2x)$, $\frac{dy}{dx} = x^2y + y$, etc.
It involves (e.g., $\frac{dy}{dx} = 3x + 5y$): an independent variable (x), a dependent variable (y), and one or more derivatives of the dependent variable ($\frac{dy}{dx}$).
The Existence and Uniqueness Theorem provides crucial insight into the behavior of solutions to first-order ordinary differential equations (ODEs). It states that if f(x, y) and its partial derivative $\frac{\partial f}{\partial y}$ are continuous in a rectangle containing the point $(x_0, y_0)$, then the differential equation y' = f(x, y) has a unique solution to the initial value problem $y(x_0) = y_0$ on some open interval containing $x_0$.
A first-order linear differential equation (ODE) has the general form: a(x)y' + b(x)y = c(x) where y' is the derivative of y with respect to x, and a(x), b(x), and c(x) are functions of x. If c(x) = 0, the equation is called homogeneous, i.e., a(x)y' + b(x)y = 0.
The equation can also be written in the standard linear form as: y' + p(x)y = q(x) where $p(x)=\frac{b(x)}{a(x)}\text{ and }q(x) = \frac{c(x)}{a(x)}$.
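As a quick sanity check of this standard form, here is a minimal SymPy sketch (assuming SymPy is available; the choices p(x) = 2 and q(x) = x are purely illustrative, not taken from the examples above) that solves one such equation symbolically:

```python
# Minimal sketch: solve y' + p(x) y = q(x) symbolically with SymPy.
# Assumptions: SymPy is installed; p(x) = 2 and q(x) = x are illustrative choices.
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

ode = sp.Eq(y(x).diff(x) + 2*y(x), x)   # y' + 2y = x
sol = sp.dsolve(ode, y(x))              # general solution, with one constant C1
print(sol)                              # e.g. y(x) = C1*exp(-2*x) + x/2 - 1/4
```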
A second-order linear homogeneous differential equation (ODE) with constant coefficients is a differential equation of the form: y'' + Ay' + By = 0 where A and B are constants.
This equation is homogeneous, meaning that there are no external forcing terms (like a function of t) on the right-hand side.
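The characteristic-equation approach for such equations can be checked quickly with SymPy. The sketch below uses illustrative coefficients A = 3, B = 2 (not values taken from the text):

```python
# Sketch: y'' + A y' + B y = 0 with constant coefficients (A = 3, B = 2 chosen
# only for illustration). The roots of r^2 + A r + B = 0 give the exponentials.
import sympy as sp

t, r = sp.symbols('t r')
y = sp.Function('y')
A, B = 3, 2

print(sp.solve(r**2 + A*r + B, r))   # characteristic roots: [-2, -1]
ode = sp.Eq(y(t).diff(t, 2) + A*y(t).diff(t) + B*y(t), 0)
print(sp.dsolve(ode, y(t)))          # general solution: C1*exp(-2*t) + C2*exp(-t)
```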
In this discussion, we aim to solve a system of linear differential equations of the form: $\vec{x}' = A\vec{x}$ where A is a constant 2×2 matrix, and $\vec{x} = (\begin{smallmatrix}x(t)\\ y(t)\end{smallmatrix})$ is the vector of unknown functions x(t) and y(t).
The goal is to solve a system of first-order linear differential equations using a change of variables that decouples the system, simplifying it into two independent equations. This approach allows us to find the solution more easily.
To decouple the system, we introduce new variables u(t) and v(t) defined by linear combinations of x(t) and y(t):
$\begin{cases} u = ax + by \\ v = cx + dy \end{cases}$
The problem is to find new variables u and v that transform the system into a decoupled form, meaning that the equations for u and v are independent of each other. The new system should look like:
$\begin{cases} u' = k_1u \\ v' = k_2v \end{cases}$
where k1 and k2 are constants. This decoupled system consists of two separate first-order differential equations, which can be solved independently. Once we solve for u(t) and v(t), we can use the transformations to express the original variables x(t) and y(t) in terms of u(t) and v(t).
Consider an old ice cube tray with two compartments connected by a small opening that allows liquid to flow between them. Let x(t) and y(t) represent the height of the liquid in the first and second chamber at time t, respectively. Refer to Figure i for a visual representation.
The flow rate (cm/sec) between the compartments is proportional to the pressure difference, which is related to the height difference between the two chambers. We assume that the area of the second chamber is double that of the first.
The system is governed by the following differential equations that describe the flow of liquid:
$\begin{cases} x' = c(y-x) \\ 2y' = c(x-y) \end{cases}$
where c > 0 is the constant of proportionality for the flow, and the factor of 2 in the second equation accounts for the second chamber having twice the cross-sectional area of the first.
To make the problem simpler and more manageable, let's assume c = 2. This simplifies the system to:
$\begin{cases} x' = -2x + 2y \\ 2y' = 2x - 2y \end{cases}$
We can further simplify the second equation by dividing it by 2:
$\begin{cases} x' = -2x + 2y \\ y' = x - y \end{cases}$
We seek new variables that decouple the system and illuminate its behavior. Since the second chamber has twice the cross-sectional area of the first, the total volume of liquid is proportional to x + 2y; together with the height difference x − y, this suggests the new variables u = x + 2y and v = x − y.
To find the dynamics of u(t), differentiate it with respect to time: u = x + 2y ⇒ u' = x' + 2y' = (−2x + 2y) + 2(x − y) = −2x + 2y + 2x − 2y = 0. Thus, u'(t) = 0, which confirms that u(t) is constant over time and aligns with the conservation of mass. The solution is $u(t) = c_1$, where $c_1$ is a constant determined by the initial conditions.
Next, differentiate v(t) with respect to time: v = x − y ⇒ v' = x' − y' = (−2x + 2y) − (x − y) = −2x + 2y − x + y = −3x + 3y = −3(x − y) = −3v ⇒ v'(t) = −3v(t). This is a simple first-order linear differential equation, and the solution is $v = c_2e^{-3t}$, where $c_2$ is a constant determined by the initial conditions.
We have successfully decoupled the system into two independent equations:
$\begin{cases} u' = 0 \\ v' = -3v \end{cases}$
The solutions are: $u = c_1$, $v = c_2e^{-3t}$.
To recover the original variables x(t) and y(t), invert the change of variables:
$\begin{cases} u = x + 2y ~(i) \\ v = x - y ~(ii) \end{cases}$
From equation (ii): v = x - y ⇒ [Solve for x] x = v + y
Now substitute x = v +y into Equation (i): u = (v+y) +2y ⇒[This simplifies to:] u = v + 3y ⇒[Solve for y] $y = \frac{u-v}{3}$
Now that we have $y = \frac{u-v}{3}$, substitute this back into the expression for x: x = v + y = $v + \frac{u-v}{3} = \frac{u+2v}{3}$
Substituting $u = c_1$ and $v = c_2e^{-3t}$ gives the original variables: x = $\frac{1}{3}(u+2v) = \frac{1}{3}(c_1+2c_2e^{-3t})$, y = $\frac{1}{3}(u-v) = \frac{1}{3}(c_1-c_2e^{-3t})$
So the general solution is $\vec{x} = \frac{1}{3}c_1(\begin{smallmatrix}1\\ 1\end{smallmatrix})+ \frac{1}{3}c_2(\begin{smallmatrix}2\\ -1\end{smallmatrix})e^{-3t}$
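If you want to double-check this result, a short SymPy computation (a verification sketch, not part of the derivation) confirms that x(t) and y(t) satisfy the original system:

```python
# Verification sketch: plug the general solution back into x' = -2x + 2y, y' = x - y.
import sympy as sp

t, c1, c2 = sp.symbols('t c1 c2')
x = sp.Rational(1, 3)*(c1 + 2*c2*sp.exp(-3*t))
y = sp.Rational(1, 3)*(c1 - c2*sp.exp(-3*t))

print(sp.simplify(x.diff(t) - (-2*x + 2*y)))  # 0
print(sp.simplify(y.diff(t) - (x - y)))       # 0
```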
We aim to solve a system of linear differential equations of the form: $\vec{x}' = A\vec{x}$ where A is a constant 2×2 matrix, and $\vec{x} = (\begin{smallmatrix}x(t)\\ y(t)\end{smallmatrix})$ is the vector of unknown functions.
We seek a transformation of variables that “decouples” this system, transforming it into two independent equations that can be solved separately. This approach simplifies the problem significantly. The decoupled form we aim for is:
$\begin{cases} u' = λ_1u \\ v' = λ_2v \end{cases}$
where u(t) and v(t) are new variables that describe the system, and λ1 and λ2 are the eigenvalues of matrix A.
For the system to decouple, the matrix A must have two linearly independent eigenvectors $\vec{α_1}$ and $\vec{α_2}$, with corresponding eigenvalues $λ_1$ and $λ_2$; in other words, A must be diagonalizable.
Let E be the matrix whose columns are the eigenvectors of A: $E = (\vec{α_1}~\vec{α_2}) = (\begin{smallmatrix}a_1 & a_2\\ b_1 & b_2\end{smallmatrix})$
We introduce a change of variables: $\vec{x} = E\vec{u}$ where $\vec{u} = (\begin{smallmatrix}u\\ v\end{smallmatrix})$
Differentiating both sides with respect to t (E is a constant matrix): $\vec{x}' = E\vec{u}'$.
Since $\vec{x}' = A\vec{x}$, we have: $E\vec{u}' = A\vec{x} = A(E\vec{u})$
Substituting AE = ED 🚀, where D is the diagonal matrix of eigenvalues ($D = (\begin{smallmatrix}λ_1 & 0\\ 0 & λ_2\end{smallmatrix})$): $E\vec{u}' = A(E\vec{u}) = (AE)\vec{u} = ED\vec{u} ↭ E\vec{u}' = ED\vec{u}$
🚀 $AE = A(\vec{α_1};\vec{α_2}) = (A\vec{α_1};A\vec{α_2})=[\text{Using the definition of eigenvectors:} (A-λ_1I)\vec{α_1} = \vec{0} ⇒ A\vec{α_1} = λ_1\vec{α_1}, A\vec{α_2} = λ_2\vec{α_2}] (λ_1\vec{α_1};λ_2\vec{α_2}), ED = (\vec{α_1};\vec{α_2})(\begin{smallmatrix}λ_1 & 0\\ 0 & λ_2\end{smallmatrix}) = (λ_1\vec{α_1};λ_2\vec{α_2})$. Since both AE and ED yield the same result, we have: AE=ED.
Simplify by multiplying both sides on the left by $E^{-1}$: $E^{-1}E\vec{u}' = E^{-1}ED\vec{u} ⇒ \vec{u}' = D\vec{u}$
Since D is diagonal, $D = (\begin{smallmatrix}λ_1 & 0\\ 0 & λ_2\end{smallmatrix})$ this results in:
$\begin{cases} u' = λ_1u \\ v' = λ_2v \end{cases}$.
The decoupled equations can be solved independently: $u(t) = c_1e^{λ_1t}$, $v(t) = c_2e^{λ_2t}$.
Using the inverse transformation: $\vec{x} = E\vec{u} = (\vec{α_1};\vec{α_2})(\begin{smallmatrix}u(t)\\ v(t)\end{smallmatrix}) = (\vec{α_1};\vec{α_2})(\begin{smallmatrix}c_1e^{λ_1t}\\ c_2e^{λ_2t}\end{smallmatrix}) = c_1e^{λ_1t}\vec{α_1} + c_2e^{λ_2t}\vec{α_2}$
Therefore, the general solution to the original system is: $\vec{x}(t) = c_1e^{λ_1t}\vec{α_1} + c_2e^{λ_2t}\vec{α_2}$.
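The whole recipe can also be carried out numerically. The sketch below (assuming NumPy; the function name solve_linear_system and the initial condition are illustrative, and A is assumed to have two linearly independent real eigenvectors) diagonalizes A, solves the decoupled equations, and reassembles $\vec{x}(t)$:

```python
# Sketch of the diagonalization recipe with NumPy. Assumes the 2x2 matrix A has
# two linearly independent (here real) eigenvectors; names are illustrative.
import numpy as np

def solve_linear_system(A, x0, t):
    """x(t) = c1*e^(l1*t)*alpha1 + c2*e^(l2*t)*alpha2 for x' = A x, x(0) = x0."""
    eigvals, E = np.linalg.eig(A)         # columns of E are the eigenvectors
    c = np.linalg.solve(E, x0)            # constants from x(0) = E c
    return E @ (c * np.exp(eigvals * t))  # x(t) = E * diag(e^(lambda*t)) * c

A = np.array([[-2.0, 2.0], [1.0, -1.0]])  # the ice-cube-tray matrix from above
print(solve_linear_system(A, np.array([1.0, 0.0]), 1.0))
```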
Let’s solve the system of differential equations:
$\begin{cases} x' = -2x + 2y \\ y' = x - y \end{cases}$
by transforming it into a decoupled system where each equation involves only one of the new variables. This makes solving the system straightforward.
Writing the System in Matrix Form
We can express the system in matrix form as: $\vec{x}' = A\vec{x}$ where A = $(\begin{smallmatrix}-2 & 2\\ 1 & -1\end{smallmatrix})$, $\vec{x} = (\begin{smallmatrix}x\\ y\end{smallmatrix})$
$(\begin{smallmatrix}x'\\ y'\end{smallmatrix}) = (\begin{smallmatrix}-2 & 2\\ 1 & -1\end{smallmatrix})(\begin{smallmatrix}x\\ y\end{smallmatrix})$
Finding Eigenvalues and Eigenvectors of A
The eigenvalues (λ) of matrix A are found by solving the characteristic equation: det(A −λI) = 0 where I is the identity matrix.
Compute A -λI = $(\begin{smallmatrix}-2-λ & 2\\ 1 & -1-λ\end{smallmatrix})$
Calculate the determinant: $det(A - λI) = (-2 - λ)(-1 - λ) - (2)(1) = (λ^2 + 3λ + 2) - 2 = λ^2 + 3λ = 0$, with solutions (eigenvalues) $λ_1 = 0$, $λ_2 = -3$.
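A quick symbolic cross-check of this characteristic polynomial (an optional SymPy verification, not part of the hand computation):

```python
# Optional SymPy check of det(A - lambda*I) and the eigenvalues.
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[-2, 2], [1, -1]])
print(sp.factor((A - lam*sp.eye(2)).det()))  # lambda*(lambda + 3)
print(A.eigenvals())                         # eigenvalues 0 and -3, each with multiplicity 1
```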
Finding Eigenvectors
For $λ_1 = 0$: Solve $(A - 0I)\vec{α_1} = 0$ ⇒ $(\begin{smallmatrix}-2 & 2\\ 1 & -1\end{smallmatrix})(\begin{smallmatrix}a_1\\ b_1\end{smallmatrix}) = (\begin{smallmatrix}0\\ 0\end{smallmatrix})$.
This yields the system: $\begin{cases} -2a_1 + 2b_1 = 0 \\ a_1 -b_1 = 0 \end{cases}$
-2a1 + 2b1 = 0, a1 -b1 = 0 ⇒ a1 = b1 ⇒ $\vec{α_1} = (\begin{smallmatrix}1\\ 1\end{smallmatrix})$. This means any scalar multiple of $(\begin{smallmatrix}1\\ 1\end{smallmatrix})$ is an eigenvector corresponding to λ1 = 0.
For $λ_2 = -3$: Solve $(A + 3I)\vec{α_2} = 0$ ⇒ $(\begin{smallmatrix}1 & 2\\ 1 & 2\end{smallmatrix})(\begin{smallmatrix}a_2\\ b_2\end{smallmatrix}) = (\begin{smallmatrix}0\\ 0\end{smallmatrix})$
This yields the system:
$\begin{cases} a_2 + 2b_2 = 0 \\ a_2 + 2b_2 = 0 \end{cases}$
Both equations are identical, giving $a_2 = -2b_2$. Choosing $b_2 = 1$ for simplicity, an eigenvector corresponding to $λ_2 = -3$ is: $\vec{α_2} = (\begin{smallmatrix}a_2\\ b_2\end{smallmatrix}) = (\begin{smallmatrix}-2\\ 1\end{smallmatrix})$
Constructing the Matrix E
The matrix E is formed by placing the eigenvectors as columns: E = $(\vec{α_1};\vec{α_2}) = (\begin{smallmatrix}1 & -2\\ 1 & 1\end{smallmatrix})$
Verifying That A is Diagonalizable
Since we have two linearly independent eigenvectors, A is diagonalizable.
AE = ED ⇒ $A = EDE^{-1}$ where D is the diagonal matrix of eigenvalues: $(\begin{smallmatrix}λ_1 & 0\\ 0 & λ_2\end{smallmatrix}) = (\begin{smallmatrix}0 & 0\\ 0 & -3\end{smallmatrix})$
Compute $E^{-1}$
det(E) = $det(\begin{smallmatrix}1 & -2\\ 1 & 1\end{smallmatrix}) = 1·1−(−2)·1 = 1+2 =3.$
Then, $E^{-1} = \frac{1}{det(E)}·(\begin{smallmatrix}e_{22} & -e_{12}\\ -e_{21} & e_{11}\end{smallmatrix})= \frac{1}{3}(\begin{smallmatrix}1 & 2\\ -1 & 1\end{smallmatrix})$
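As an optional check of this inverse (a NumPy one-liner, assuming NumPy is available):

```python
# Optional NumPy check of E^{-1} for E = [[1, -2], [1, 1]].
import numpy as np

E = np.array([[1.0, -2.0], [1.0, 1.0]])
print(np.linalg.inv(E))  # approximately [[0.333, 0.667], [-0.333, 0.333]] = (1/3)[[1, 2], [-1, 1]]
```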
Changing Variables
We introduce the variables: $\vec{u} = E^{-1}\vec{x} = \frac{1}{3}(\begin{smallmatrix}1 & 2\\ -1 & 1\end{smallmatrix})(\begin{smallmatrix}x\\ y\end{smallmatrix}) = (\begin{smallmatrix}u\\ v\end{smallmatrix})$
Thus, the transformation is: u = $\frac{1}{3}(x+2y)$, v = $\frac{1}{3}(-x+y)$
Expressing $\vec{x}$ in Terms of $\vec{u}$
$\vec{x} = E\vec{u} = (\begin{smallmatrix}1 & -2\\ 1 & 1\end{smallmatrix})(\begin{smallmatrix}u\\ v\end{smallmatrix})$
This results in: x = u −2v, y = u +v
Transforming the Original System
Substitute $\vec{x} = E\vec{u}$, $\vec{x}' = E\vec{u}'$ into $\vec{x}' = A\vec{x}$: $E\vec{u}' = AE\vec{u}$.
Since AE = ED: $E\vec{u}' = ED\vec{u}$
Multiply both sides by $E^{-1}$: $E^{-1}E\vec{u}' = E^{-1}ED\vec{u} ⇒ \vec{u}' = D\vec{u} = (\begin{smallmatrix}0 & 0\\ 0 & -3\end{smallmatrix})\vec{u} ↭ (\begin{smallmatrix}u'\\ v'\end{smallmatrix}) = (\begin{smallmatrix}0 & 0\\ 0 & -3\end{smallmatrix})(\begin{smallmatrix}u\\ v\end{smallmatrix})$.
This results in two decoupled equations:
$\begin{cases} u' = 0 \\ v' = -3v \end{cases}$
Now, solve each equation independently: $u(t) = c_1$, $v(t) = c_2e^{-3t}$.
Expressing x(t) and y(t) in Terms of u(t) and v(t)
Using the relations: x = u −2v, y =u +v
Substitute the solutions for u(t) and v(t): $x(t) = c_1-2c_2e^{-3t}, y(t) = c_1+c_2e^{-3t}$
Writing the General Solution
Thus, the general solution to the system is:
$\vec{x}(t) = (\begin{smallmatrix}x(t)\\ y(t)\end{smallmatrix}) = c_1(\begin{smallmatrix}1\\ 1\end{smallmatrix})+c_2e^{-3t}(\begin{smallmatrix}-2\\ 1\end{smallmatrix})$.
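As a final cross-check of this example, an illustrative numerical comparison using SciPy and NumPy (the initial condition x(0) = 1, y(0) = 0 is chosen arbitrarily) shows that direct integration agrees with the closed-form solution:

```python
# Numerical cross-check: integrate x' = -2x + 2y, y' = x - y with SciPy and
# compare against the closed-form solution. The initial condition is illustrative.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, z):
    x, y = z
    return [-2*x + 2*y, x - y]

x0, y0 = 1.0, 0.0
# Constants from x(0) = c1 - 2*c2 and y(0) = c1 + c2:
c1, c2 = np.linalg.solve([[1.0, -2.0], [1.0, 1.0]], [x0, y0])

sol = solve_ivp(rhs, (0.0, 2.0), [x0, y0], t_eval=[2.0], rtol=1e-9, atol=1e-12)
exact = [c1 - 2*c2*np.exp(-3*2.0), c1 + c2*np.exp(-3*2.0)]
print(sol.y[:, -1], exact)  # the two results should agree to high accuracy
```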
Let’s solve the system of differential equations:
$\begin{cases} x' = 2x + y \\ y' = x + 2y \end{cases}$
by transforming it into a decoupled system where each equation involves only one of the new variables. This makes solving the system straightforward.
Writing the System in Matrix Form
We can express the system in matrix form as: $\vec{x}' = A\vec{x}$ where A = $(\begin{smallmatrix}2 & 1\\ 1 & 2\end{smallmatrix})$, $\vec{x} = (\begin{smallmatrix}x\\ y\end{smallmatrix})$
$(\begin{smallmatrix}x'\\ y'\end{smallmatrix}) = (\begin{smallmatrix}2 & 1\\ 1 & 2\end{smallmatrix})(\begin{smallmatrix}x\\ y\end{smallmatrix})$
Finding Eigenvalues and Eigenvectors of A
The eigenvalues (λ) of matrix A are found by solving the characteristic equation: det(A −λI) = 0 where I is the identity matrix.
Compute A -λI = $(\begin{smallmatrix}2-λ & 1\\ 1 & 2-λ\end{smallmatrix})$
Calculate the determinant: $det(A - λI) = (2-λ)(2-λ) - 1 = 0 ↭ λ^2 - 4λ + 3 = 0 ↭ λ = \frac{4±\sqrt{16-12}}{2} = \frac{4 ± 2}{2}$. So the eigenvalues are $λ_1 = 3$ and $λ_2 = 1$.
Finding Eigenvectors
For $λ_1 = 3$: Solve $(A - 3I)\vec{α_1} = 0$ ⇒ $(\begin{smallmatrix}-1 & 1\\ 1 & -1\end{smallmatrix})(\begin{smallmatrix}a_1\\ b_1\end{smallmatrix}) = (\begin{smallmatrix}0\\ 0\end{smallmatrix})$.
This yields the system: $\begin{cases} -a_1 + b_1 = 0 \\ a_1 -b_1 = 0 \end{cases}$
So, a1 = b1. An eigenvector is $\vec{α_1}=(\begin{smallmatrix}1\\ 1\end{smallmatrix})$
For $λ_2 = 1$: Solve $(A - I)\vec{α_2} = 0$ ⇒ $(\begin{smallmatrix}1 & 1\\ 1 & 1\end{smallmatrix})(\begin{smallmatrix}a_2\\ b_2\end{smallmatrix}) = (\begin{smallmatrix}0\\ 0\end{smallmatrix})$.
This yields the system: $\begin{cases} a_2 + b_2 = 0 \\ a_2 + b_2 = 0 \end{cases}$
So, $a_2 + b_2 = 0 ⇒ b_2 = -a_2.$ An eigenvector is $\vec{α_2}=(\begin{smallmatrix}1\\ -1\end{smallmatrix})$
Constructing the Matrix E
The matrix E is formed by placing the eigenvectors as columns: E = $(\vec{α_1};\vec{α_2}) = (\begin{smallmatrix}1 & 1\\ 1 & -1\end{smallmatrix})$
Verifying That A is Diagonalizable
Since we have two linearly independent eigenvectors, A is diagonalizable.
AE = ED ⇒ $A = EDE^{-1}$ where D is the diagonal matrix of eigenvalues: $(\begin{smallmatrix}λ_1 & 0\\ 0 & λ_2\end{smallmatrix}) = (\begin{smallmatrix}3 & 0\\ 0 & 1\end{smallmatrix})$
Compute $E^{-1}$
det(E) = $det(\begin{smallmatrix}1 & 1\\ 1 & -1\end{smallmatrix}) = 1·(-1)−1·1 = -2.$
Then, $E^{-1} = \frac{1}{det(E)}·(\begin{smallmatrix}e_{22} & -e_{12}\\ -e_{21} & e_{11}\end{smallmatrix})= \frac{1}{-2}(\begin{smallmatrix}-1 & -1\\ -1 & 1\end{smallmatrix}) = \frac{1}{2}(\begin{smallmatrix}1 & 1\\ 1 & -1\end{smallmatrix})$
Changing Variables
We introduce the variables: $\vec{u} = E^{-1}\vec{x} = \frac{1}{2}(\begin{smallmatrix}1 & 1\\ 1 & -1\end{smallmatrix})(\begin{smallmatrix}x\\ y\end{smallmatrix}) = (\begin{smallmatrix}u\\ v\end{smallmatrix})$
Thus, the transformation is: u = $\frac{1}{2}(x+y)$, v = $\frac{1}{2}(x-y)$
Expressing $\vec{x}$ in Terms of $\vec{u}$
$\vec{x} = E\vec{u} = (\begin{smallmatrix}1 & 1\\ 1 & -1\end{smallmatrix})(\begin{smallmatrix}u\\ v\end{smallmatrix})$
This results in: x = u + v, y = u - v
Transforming the Original System
Substitute $\vec{x} = E\vec{u}$, $\vec{x}' = E\vec{u}'$ into $\vec{x}' = A\vec{x}$: $E\vec{u}' = AE\vec{u}$.
Since AE = ED: $E\vec{u}' = ED\vec{u}$
Multiply both sides by $E^{-1}$: $E^{-1}E\vec{u}' = E^{-1}ED\vec{u} ⇒ \vec{u}' = D\vec{u} = (\begin{smallmatrix}3 & 0\\ 0 & 1\end{smallmatrix})\vec{u} ↭ (\begin{smallmatrix}u'\\ v'\end{smallmatrix}) = (\begin{smallmatrix}3 & 0\\ 0 & 1\end{smallmatrix})(\begin{smallmatrix}u\\ v\end{smallmatrix})$.
This results in two decoupled equations:
$\begin{cases} u' = 3u \\ v' = v \end{cases}$
Now, solve each equation independently: $u(t) = c_1e^{3t}$, $v(t) = c_2e^{t}$.
Expressing x(t) and y(t) in Terms of u(t) and v(t)
Using the relations: x = u + v, y = u -v
Substitute the solutions for u(t) and v(t): $x(t) = c_1e^{3t}+c_2e^{t}, y(t) = c_1e^{3t}-c_2e^{t}$
Writing the General Solution
Thus, the general solution to the system is:
$\vec{x}(t) = (\begin{smallmatrix}x(t)\\ y(t)\end{smallmatrix}) = c_1e^{3t}(\begin{smallmatrix}1\\ 1\end{smallmatrix})+c_2e^{t}(\begin{smallmatrix}1\\ -1\end{smallmatrix})$.
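Again, a short SymPy verification (optional and illustrative, not part of the derivation) confirms that this general solution satisfies the system:

```python
# Verification sketch: plug the general solution back into x' = 2x + y, y' = x + 2y.
import sympy as sp

t, c1, c2 = sp.symbols('t c1 c2')
x = c1*sp.exp(3*t) + c2*sp.exp(t)
y = c1*sp.exp(3*t) - c2*sp.exp(t)

print(sp.simplify(x.diff(t) - (2*x + y)))  # 0
print(sp.simplify(y.diff(t) - (x + 2*y)))  # 0
```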