Quidquid latine dictum sit, altum sonatur, - Whatever is said in Latin sounds profound.
We will continue having lots of useless documentation to save our asses, tons of pointless meetings, and fruitless and endless discussions until we find out why no work is getting done and the code is unmaintainable. Meanwhile, I will keep using unrelated words and single letters in naming functions and variables, recycling variables and making as many of them static as I possibly can, documenting the obvious, but never ever writing a variable declaration or explaining what some section is attempting to accomplish, and avoiding testing like the plague, Apocalypse, Anawim, #justtothepoint.
Recall that a field is a commutative ring with unity such that each nonzero element has a multiplicative inverse, e.g., every finite integral domain, ℤ_{p} (p prime), ℚ, ℝ, and ℂ. A field has characteristic zero or characteristic p with p prime. F[x] itself is only an integral domain, but F[x]/⟨f(x)⟩ is a field when f(x) is irreducible over F.
Definition. Let F be a field and V a set equipped with two operations, vector addition and scalar multiplication.
We say V is a vector space over F if (V, +) is an Abelian group under addition, and ∀a, b ∈ F, u, v ∈ V the following conditions hold: a(u + v) = au + av, (a + b)u = au + bu, a(bu) = (ab)u, and 1u = u.
Definition. Suppose V is a vector space over a field F and let S = {v_{1}, v_{2}, ···, v_{n}} be a set of vectors from V. We say S is linearly dependent if there are elements α_{1}, α_{2}, ···, α_{n} ∈ F, not all zero, such that α_{1}v_{1} + α_{2}v_{2} + ··· + α_{n}v_{n} = 0. We say S is linearly independent if α_{1}v_{1} + α_{2}v_{2} + ··· + α_{n}v_{n} = 0 ⇒ α_{1} = α_{2} = ··· = α_{n} = 0.
Examples:
ℝ^{2}, S = {(1, 2), (3, -4)} is linearly independent: x$(\begin{smallmatrix}1\\ 2\end{smallmatrix}) + y(\begin{smallmatrix}3\\ -4\end{smallmatrix})=(\begin{smallmatrix}0\\ 0\end{smallmatrix}) ⇒ (\begin{smallmatrix}x+3y\\ 2x-4y\end{smallmatrix})=(\begin{smallmatrix}0\\ 0\end{smallmatrix})$ ⇒ x + 3y = 0, 2x - 4y = 0 ⇒ x = y = 0.
ℝ^{3}, S = {(1, 0, 0), (1, 0, 1), (1, 1, 1)} is linearly independent because a(1, 0, 0) + b(1, 0, 1) + c(1, 1, 1) = (a + b + c, c, b + c) = (0, 0, 0) ⇒ a = b = c = 0.
ℝ^{3}, S = {(1, 2, 1), (0, 1, 2), (1, 1, 0)} is linearly independent because a(1, 2, 1) + b(0, 1, 2) + c(1, 1, 0) = (a + c, 2a + b + c, a + 2b) = (0, 0, 0) ⇒ a = b = c = 0.
ℝ^{3}, S = {(1, 1, 0), (0, 1, 1), (2, 3, 1)} is linearly dependent with a = -2, b = -1, c = 1:
$-2(\begin{smallmatrix}1\\ 1\\ 0\end{smallmatrix}) -1(\begin{smallmatrix}0\\ 1\\ 1\end{smallmatrix}) + (\begin{smallmatrix}2\\ 3\\ 1\end{smallmatrix})=(\begin{smallmatrix}0\\ 0\\ 0\end{smallmatrix})$
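These independence checks can be verified numerically: a set of vectors is linearly independent exactly when the matrix having them as columns has rank equal to the number of vectors. A minimal sketch in Python with NumPy (the helper name `is_independent` is ours, not a library function):

```python
import numpy as np

def is_independent(vectors):
    """Vectors are linearly independent iff the matrix having them
    as columns has full column rank."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

# The three examples from ℝ³ above:
print(is_independent([(1, 0, 0), (1, 0, 1), (1, 1, 1)]))  # True
print(is_independent([(1, 2, 1), (0, 1, 2), (1, 1, 0)]))  # True
print(is_independent([(1, 1, 0), (0, 1, 1), (2, 3, 1)]))  # False
```

Note that `matrix_rank` works with floating-point tolerances, so for exact arithmetic over ℚ a symbolic tool would be preferable; for small integer examples like these it is reliable.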
Proposition. Suppose V is a vector space over a field F and let S be a set of vectors from V, S = {v_{1}, v_{2}, ···, v_{n}}. {v_{1}, v_{2}, ···, v_{n}} is linearly dependent if and only if one of the v_{i} can be written as a linear combination of the others.
Proof.
⇒) Suppose {v_{1}, v_{2}, ···, v_{n}} is linearly dependent ⇒ ∃α_{1}, α_{2}, ···, α_{n} ∈ F, not all zero, i.e., α_{i} ≠ 0 for some 1 ≤ i ≤ n, such that α_{1}v_{1} + α_{2}v_{2} + ··· + α_{n}v_{n} = 0 ⇒ α_{i}v_{i} = -(α_{1}v_{1} + α_{2}v_{2} + ··· + α_{i-1}v_{i-1} + α_{i+1}v_{i+1} + ··· + α_{n}v_{n}) ⇒ [F is a field and α_{i} ≠ 0, so α_{i} has a multiplicative inverse] v_{i} = (-α_{1}α_{i}^{-1})v_{1} + (-α_{2}α_{i}^{-1})v_{2} + ··· + (-α_{i-1}α_{i}^{-1})v_{i-1} + (-α_{i+1}α_{i}^{-1})v_{i+1} + ··· + (-α_{n}α_{i}^{-1})v_{n} ∎
⇐) Suppose v_{i} = β_{1}v_{1} + β_{2}v_{2} + ··· + β_{i-1}v_{i-1} + β_{i+1}v_{i+1} + ··· + β_{n}v_{n} ⇒ β_{1}v_{1} + β_{2}v_{2} + ··· + β_{i-1}v_{i-1} - v_{i} + β_{i+1}v_{i+1} + ··· + β_{n}v_{n} = 0, and one of the coefficients, namely the coefficient of v_{i}, is -1 ≠ 0 ∎
0, 1 and -1 are necessary elements in every field. 0 and 1 are the addition and multiplication identities respectively, and -1 is the additive inverse of 1 in F. However, those elements can be the same, e.g., -1 = 1 in ℤ_{2}.
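The proposition can be checked on the dependent set above: since a = -2, b = -1, c = 1, the vector (2, 3, 1) must be a linear combination of the other two. A short sketch recovering the coefficients with NumPy's least-squares solver (the system is 3 equations in 2 unknowns, but consistent):

```python
import numpy as np

# Write (2, 3, 1) in terms of (1, 1, 0) and (0, 1, 1).
A = np.column_stack([(1, 1, 0), (0, 1, 1)])
b = np.array([2, 3, 1])
coeffs, residual, rank, _ = np.linalg.lstsq(A, b, rcond=None)
print(coeffs)  # [2. 1.]  →  (2, 3, 1) = 2(1, 1, 0) + 1(0, 1, 1)
```

This matches solving for v_{3} in -2v_{1} - v_{2} + v_{3} = 0, i.e., v_{3} = 2v_{1} + v_{2}.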
Definition. A collection of vectors B = {v_{1}, v_{2}, ···, v_{n}} ⊆ V is called a basis for V if B is linearly independent and span{v_{1}, v_{2}, ···, v_{n}} = {α_{1}v_{1} + α_{2}v_{2} + ··· + α_{n}v_{n} | α_{i} ∈ F} = V.
Proposition. B is a basis of V if and only if every vector v ∈ V can be written uniquely as a linear combination of elements of the basis.
Proof.
⇒) Let’s take v ∈ V = span{v_{1}, v_{2}, ···, v_{n}} = {α_{1}v_{1} + α_{2}v_{2} + ··· + α_{n}v_{n} | α_{i} ∈ F}, and suppose v = α_{1}v_{1} + α_{2}v_{2} + ··· + α_{n}v_{n} = β_{1}v_{1} + β_{2}v_{2} + ··· + β_{n}v_{n}, where α_{i}, β_{i} ∈ F, v_{i} ∈ B. If the two expressions involved different basis vectors, we could pad each with zero coefficients so that both range over the same v_{i} ⇒ (α_{1}-β_{1})v_{1} + (α_{2}-β_{2})v_{2} + ··· + (α_{n}-β_{n})v_{n} = 0 ⇒ [B is a basis for V, so B is linearly independent] α_{i}-β_{i} = 0, 1 ≤ i ≤ n ⇒ α_{i} = β_{i}, 1 ≤ i ≤ n ∎
⇐) Every vector v ∈ V can be written uniquely as a linear combination of elements of the basis ⇒ V = span{v_{1}, v_{2}, ···, v_{n}} = {α_{1}v_{1} + α_{2}v_{2} + ··· + α_{n}v_{n}| α_{i} ∈ F}
Let’s prove that B is linearly independent.
α_{1}v_{1} + α_{2}v_{2} + ··· + α_{n}v_{n} = 0 = 0v_{1} + 0v_{2} + ··· + 0v_{n} ⇒ [By assumption, every vector v ∈ V can be written uniquely as a linear combination of elements of the basis] α_{i} = 0, 1 ≤ i ≤ n ∎
Definition. Suppose V has a basis B = {v_{1}, v_{2}, ···, v_{n}}, so that V = span{v_{1}, v_{2}, ···, v_{n}} = {α_{1}v_{1} + α_{2}v_{2} + ··· + α_{n}v_{n} | α_{i} ∈ F}. Let’s define the coordinate map, γ_{B}: V → F^{n}, γ_{B}(v) = $(\begin{smallmatrix}α_1\\ α_2\\ ··· \\α_n\end{smallmatrix})$ if v = α_{1}v_{1} + α_{2}v_{2} + ··· + α_{n}v_{n}
Examples.
{1, x, x^{2}} is the standard basis of $\wp_2$ (all polynomials of degree at most two). γ_{B}: $\wp_2$ → F^{3}, γ_{B}(α_{0} + α_{1}x + α_{2}x^{2}) = $(\begin{smallmatrix}α_0\\ α_1\\α_2\end{smallmatrix})$.
Is B = {$(\begin{smallmatrix}2\\ 3\end{smallmatrix}), (\begin{smallmatrix}1\\ -1\end{smallmatrix})$} a basis of ℝ^{2}? Yes: x(2, 3) + y(1, -1) = (0, 0) ⇒ 2x + y = 0, 3x - y = 0 ⇒ 5x = 0 ⇒ x = y = 0, so B is linearly independent. B also spans ℝ^{2}: for any (a, b) ∈ ℝ^{2}, taking x = (a + b)/5, y = (3a - 2b)/5 gives x(2, 3) + y(1, -1) = (a, b). Therefore, B is a basis.
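Numerically, the basis question reduces to a determinant check (two independent columns in a 2×2 matrix), and evaluating the coordinate map γ_{B} reduces to solving a linear system. A sketch (the sample vector v below is our own choice for illustration):

```python
import numpy as np

# B = {(2, 3), (1, -1)} as columns; nonzero determinant ⇔ independent ⇔ basis of ℝ².
B = np.column_stack([(2, 3), (1, -1)])
print(np.linalg.det(B))   # -5.0, nonzero, so B is a basis

# Coordinate map γ_B: solve B · (α1, α2)ᵀ = v.
v = np.array([3, 2])
coords = np.linalg.solve(B, v)
print(coords)             # [1. 1.]: v = 1·(2, 3) + 1·(1, -1)
print(B @ coords)         # recovers v: [3. 2.]
```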
Theorem. Suppose V = span{v_{1}, v_{2}, ···, v_{n}} and {w_{1}, w_{2}, ···, w_{m}} is linearly independent. Then, m ≤ n.
Proof.
w_{1} ∈ V = span{v_{1}, v_{2}, ···, v_{n}} ⇒ w_{1} = α_{1}v_{1} + α_{2}v_{2} + ··· + α_{n}v_{n}, α_{i} ∈ F, not all zero, because if all α_{i} were zero, then w_{1} = 0 and {w_{1}, w_{2}, ···, w_{m}} would be linearly dependent.
Rename v_{1}, v_{2}, ···, v_{n} so that α_{1} ≠ 0 ⇒ v_{1} = -α_{1}^{-1}(α_{2}v_{2} + α_{3}v_{3} + ··· + α_{n}v_{n} - w_{1}) ∈ span{w_{1}, v_{2}, ···, v_{n}} ⇒ V = span{w_{1}, v_{2}, ···, v_{n}}
Let’s repeat the process: write w_{2} = β_{1}w_{1} + β_{2}v_{2} + ··· + β_{n}v_{n}, where β_{2}, ···, β_{n} are not all zero because otherwise w_{2} = β_{1}w_{1}, and then {w_{1}, w_{2}, ···, w_{m}} would be linearly dependent.
Rename or reorder, so that β_{2} ≠ 0, and v_{2} = -β_{2}^{-1}(β_{1}w_{1} + β_{3}v_{3} + ··· + β_{n}v_{n} -w_{2}) ∈ span{w_{1}, w_{2}, v_{3} ···, v_{n}} ⇒ V = span{w_{1}, w_{2}, v_{3} ···, v_{n}}
Let’s continue this process until we have used all the w_{i}. The process cannot run out of v’s first: if m > n, after n steps we would have V = span{w_{1}, ···, w_{n}}, so w_{n+1} would be a linear combination of w_{1}, ···, w_{n}, contradicting linear independence. Therefore, m ≤ n and V = span{w_{1}, ···, w_{m}, v_{m+1}, ···, v_{n}} ∎
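A concrete consequence of the theorem: ℝ³ is spanned by three vectors, so any four vectors in ℝ³ must be linearly dependent. In matrix terms, a 3×4 matrix can never have rank 4:

```python
import numpy as np

rng = np.random.default_rng(0)
# Four random integer vectors in ℝ³, stored as the columns of a 3×4 matrix.
A = rng.integers(-5, 6, size=(3, 4))
# rank(A) ≤ 3 < 4, so the four columns are always linearly dependent.
print(np.linalg.matrix_rank(A) < 4)  # True
```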
Definition. If B = {v_{1}, v_{2}, ···, v_{n}} is a basis for V, V is said to have dimension n.
Proposition. The notion of dimension is well defined; in other words, every basis has the same number of vectors. E.g., {1, i} is a basis for ℂ over ℝ, and the dimension is two.
Proof. Suppose B = {v_{1}, v_{2}, ···, v_{n}} and B’ = {w_{1}, w_{2}, ···, w_{m}} are bases for V.
B spans V and B’ is linearly independent ⇒ m ≤ n, but B’ spans V and B is linearly independent ⇒ n ≤ m. Therefore, m = n ∎
| Vector space | Basis | Dimension |
|---|---|---|
| {0} | ∅ | 0 |
| F^{n} | e_{1}, e_{2}, ···, e_{n} | n |
| F[x] | 1, x, x^{2}, x^{3}, ··· | +∞ |
| $\wp_n$ | 1, x, x^{2}, x^{3}, ···, x^{n} | n + 1 |
| ℂ(ℝ) | unlistable | +∞ |
Example. Let V = {(x_{1}, x_{2}, x_{3}) ∈ ℝ^{3} | x_{1} + x_{2} + x_{3} = 0}. x_{1} + x_{2} + x_{3} = 0 ⇒ x_{1} = -x_{2} - x_{3} ⇒ $v=(\begin{smallmatrix}-x_2-x_3\\ x_2\\x_3\end{smallmatrix}) = x_2(\begin{smallmatrix}-1\\ 1\\0\end{smallmatrix})+x_3(\begin{smallmatrix}-1\\ 0\\1\end{smallmatrix})$, for some x_{2}, x_{3} ∈ ℝ. Therefore, V = span{$(\begin{smallmatrix}-1\\ 1\\0\end{smallmatrix}), (\begin{smallmatrix}-1\\ 0\\1\end{smallmatrix})$}
We only need to prove that these two vectors are linearly independent. Suppose α, β ∈ ℝ,
$α(\begin{smallmatrix}-1\\ 1\\ 0\end{smallmatrix})+β(\begin{smallmatrix}-1\\ 0\\ 1\end{smallmatrix}) = (\begin{smallmatrix}0\\ 0\\ 0\end{smallmatrix})$ ⇒ -α-β=0, α=0, β=0, ∎. Therefore, we have a basis and dim(V) = 2.
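The same basis can be found symbolically: V is the null space of the 1×3 matrix [1 1 1], and SymPy's `nullspace` should return exactly the two vectors found above (up to ordering):

```python
import sympy as sp

# V = {x ∈ ℝ³ | x1 + x2 + x3 = 0} is the null space of the row matrix [1 1 1].
A = sp.Matrix([[1, 1, 1]])
basis = A.nullspace()
for b in basis:
    print(b.T)      # the basis vectors (-1, 1, 0) and (-1, 0, 1), as columns
print(len(basis))   # 2 = dim(V)
```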
Let f(x) be an arbitrary element of {f(x) ∈ ℝ[x] | f(3) = 0 and deg(f(x)) ≤ 2} ⊆ $\wp_2$. f(3) = 0 ⇒ f(x) = (x - 3)g(x), where g(x) is a polynomial of degree at most 1.
f(x) = (ax + b)(x - 3) = ax(x - 3) + b(x - 3) ∈ span{x(x - 3), x - 3}
{f(x) ∈ ℝ[x] | f(3) = 0 and deg(f(x)) ≤ 2} = span{x(x - 3), x - 3}, so we only need to show that these two polynomials are linearly independent.
ax(x - 3) + b(x -3) = 0. If x = 0 ⇒ b(-3) = 0 ⇒ b = 0.
ax(x - 3) = 0 ∀x ∈ ℝ. In particular, for x = 1, a(-2) = 0 ⇒ a = 0. Therefore, {x(x - 3), x - 3} is a basis, and the dimension is equal to two ∎
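Since the coordinate map γ_{B} identifies $\wp_2$ with F^{3}, the independence of the two polynomials can also be checked on their coefficient vectors (ordered as (a_{0}, a_{1}, a_{2}) for a_{0} + a_{1}x + a_{2}x²):

```python
import numpy as np

# x(x - 3) = -3x + x²  →  coefficient vector (0, -3, 1)
# x - 3               →  coefficient vector (-3, 1, 0)
P = np.column_stack([(0, -3, 1), (-3, 1, 0)])
print(np.linalg.matrix_rank(P))  # 2: the two polynomials are linearly independent
```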