

Vector Spaces II

Quidquid latine dictum sit, altum sonatur, - Whatever is said in Latin sounds profound.


Recall that a field is a commutative ring with unity such that each nonzero element has a multiplicative inverse, e.g., every finite integral domain, ℤp (p prime), ℚ, ℝ, and ℂ. A field has characteristic zero or characteristic p with p prime. If F is a field, the polynomial ring F[x] is an integral domain (it is not itself a field, but F[x]/⟨p(x)⟩ is a field whenever p(x) is irreducible over F).
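As a quick sanity check (our own illustrative sketch, not part of the original text), a few lines of Python verify that every nonzero element of ℤp has a multiplicative inverse; p = 7 is an arbitrary choice, any prime would do.

```python
# Sanity check (illustrative): in Z_p with p prime, every nonzero element
# has a multiplicative inverse, so Z_p is a field.
p = 7  # any prime works here
for a in range(1, p):
    inverse = next(b for b in range(1, p) if (a * b) % p == 1)
    print(f"{a}^(-1) = {inverse} (mod {p})")
```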

Definition. Let F be a field and V a set equipped with two operations: vector addition, +: V × V → V, and scalar multiplication, ·: F × V → V.

We say V is a vector space over F if (V, +) is an Abelian group under addition, and ∀a, b ∈ F, u, v ∈ V the following conditions hold:

  1. a(v + u) = av + au, (a + b)v = av + bv
  2. a(bv) = (ab)v
  3. 1v = v.


Definition. Suppose V is a vector space over a field F and let S be a set of vectors from V, S = {v1, v2, ···, vn}. We say S is linearly dependent if there are elements α1, α2, ···, αn ∈ F, not all zero, such that α1v1 + α2v2 + ··· + αnvn = 0. We say S is linearly independent if α1v1 + α2v2 + ··· + αnvn = 0 ⇒ α1 = α2 = ··· = αn = 0.

Examples:

$x(\begin{smallmatrix}1\\ 2\end{smallmatrix}) + y(\begin{smallmatrix}3\\ -4\end{smallmatrix})=(\begin{smallmatrix}0\\ 0\end{smallmatrix}) ⇒ (\begin{smallmatrix}x+3y\\ 2x-4y\end{smallmatrix})=(\begin{smallmatrix}0\\ 0\end{smallmatrix})$ ⇒ x + 3y = 0, 2x - 4y = 0 ⇒ x = y = 0.

$-2(\begin{smallmatrix}1\\ 1\\ 0\end{smallmatrix}) -1(\begin{smallmatrix}0\\ 1\\ 1\end{smallmatrix}) + (\begin{smallmatrix}2\\ 3\\ 1\end{smallmatrix})=(\begin{smallmatrix}0\\ 0\\ 0\end{smallmatrix})$, so these three vectors are linearly dependent.
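These two examples can also be checked numerically. A minimal sketch using NumPy (our own illustration; the library choice is an assumption): stack the vectors as columns and compare the rank of the resulting matrix with the number of vectors.

```python
import numpy as np

# Example 1: columns (1, 2) and (3, -4); rank 2 = number of vectors -> independent.
A = np.array([[1, 3],
              [2, -4]])
print(np.linalg.matrix_rank(A))  # 2

# Example 2: columns (1, 1, 0), (0, 1, 1), (2, 3, 1); rank 2 < 3 -> dependent.
B = np.array([[1, 0, 2],
              [1, 1, 3],
              [0, 1, 1]])
print(np.linalg.matrix_rank(B))  # 2

# The stated relation -2*v1 - v2 + v3 = 0 indeed holds.
print(np.allclose(-2 * B[:, 0] - B[:, 1] + B[:, 2], np.zeros(3)))  # True
```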

Proposition. Suppose V is a vector space over a field F and let S be a set of vectors from V, S = {v1, v2, ···, vn}. {v1, v2, ···, vn} is linearly dependent if and only if one of the vi can be written as a linear combination of the others.

Proof.

⇒) Suppose {v1, v2, ···, vn} is linearly dependent ⇒ ∃α1, α2, ···, αn ∈ F not all zero, so ∃αi ≠ 0 for some 1 ≤ i ≤ n, such that α1v1 + α2v2 + ··· + αnvn = 0 ⇒ αivi = -(α1v1 + α2v2 + ··· + αi-1vi-1 + αi+1vi+1 + ··· + αnvn) ⇒ [F is a field and αi ≠ 0, so αi has a multiplicative inverse] $v_i = (-α_1α_i^{-1})v_1 + (-α_2α_i^{-1})v_2 + ··· + (-α_{i-1}α_i^{-1})v_{i-1} + (-α_{i+1}α_i^{-1})v_{i+1} + ··· + (-α_nα_i^{-1})v_n$

⇐) Suppose vi = β1v1 + β2v2 + ··· + βi-1vi-1 + βi+1vi+1 + ··· + βnvn ⇒ β1v1 + β2v2 + ··· + βi-1vi-1 - vi + βi+1vi+1 + ··· + βnvn = 0, and one of the coefficients, namely -1, is nonzero ∎
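For instance, the dependence relation from the example above, $-2(\begin{smallmatrix}1\\ 1\\ 0\end{smallmatrix}) -1(\begin{smallmatrix}0\\ 1\\ 1\end{smallmatrix}) + (\begin{smallmatrix}2\\ 3\\ 1\end{smallmatrix})=(\begin{smallmatrix}0\\ 0\\ 0\end{smallmatrix})$, rearranges to $(\begin{smallmatrix}2\\ 3\\ 1\end{smallmatrix}) = 2(\begin{smallmatrix}1\\ 1\\ 0\end{smallmatrix}) + (\begin{smallmatrix}0\\ 1\\ 1\end{smallmatrix})$, exhibiting the third vector as a linear combination of the other two.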

0, 1 and -1 are necessary elements in every field: 0 and 1 are the additive and multiplicative identities, respectively, and -1 is the additive inverse of 1 in F. However, these elements need not be distinct, e.g., -1 = 1 in ℤ2.

Definition. A collection of vectors B = {v1, v2, ···, vn} ⊆ V is called a basis for V if B is linearly independent and span{v1, v2, ···, vn} = {α1v1 + α2v2 + ··· + αnvn| αi ∈ F} = V.

Proposition. B is a basis of V if and only if every vector v ∈ V can be written uniquely as a linear combination of elements of the basis.

Proof.

⇒) Let’s take v ∈ V = span{v1, v2, ···, vn} = {α1v1 + α2v2 + ··· + αnvn| αi ∈ F}, and suppose v = α1v1 + α2v2 + ··· + αnvn = β1v1 + β2v2 + ··· + βnvn where αi, βi ∈ F, vi ∈ B. If the two expressions involved different basis vectors, we could pad them with zero coefficients so that both run over the same v1, ···, vn ⇒ (α1-β1)v1 + (α2-β2)v2 + ··· + (αn-βn)vn = 0 ⇒ [B is a basis for V, so B is linearly independent] αi-βi = 0, 1 ≤ i ≤ n ⇒ αi = βi, 1 ≤ i ≤ n ∎

⇐) Every vector v ∈ V can be written uniquely as a linear combination of elements of the basis ⇒ V = span{v1, v2, ···, vn} = {α1v1 + α2v2 + ··· + αnvn| αi ∈ F}

Let’s prove that B is linearly independent.

α1v1 + α2v2 + ··· + αnvn = 0 = 0v1 + 0v2 + ··· + 0vn ⇒ [By assumption, every vector v ∈ V can be written uniquely as a linear combination of elements of the basis] αi = 0, 1 ≤ i ≤ n ∎

Examples. Let V = {$(\begin{smallmatrix}a & a+b\\ a+b & b\end{smallmatrix})$ | a, b ∈ ℝ} and B = {$(\begin{smallmatrix}1 & 1\\ 1 & 0\end{smallmatrix}), (\begin{smallmatrix}0 & 1\\ 1 & 1\end{smallmatrix})$}. B is a basis of V:

  1. Linear independence. $a(\begin{smallmatrix}1 & 1\\ 1 & 0\end{smallmatrix}) + b(\begin{smallmatrix}0 & 1\\ 1 & 1\end{smallmatrix}) = (\begin{smallmatrix}0 & 0\\ 0 & 0\end{smallmatrix}) ⇒ (\begin{smallmatrix}a & a+b\\ a+b & b\end{smallmatrix}) = (\begin{smallmatrix}0 & 0\\ 0 & 0\end{smallmatrix})$ ⇒ a = b = 0.
  2. $(\begin{smallmatrix}a & a+b\\ a+b & b\end{smallmatrix}) = a(\begin{smallmatrix}1 & 1\\ 1 & 0\end{smallmatrix}) + b(\begin{smallmatrix}0 & 1\\ 1 & 1\end{smallmatrix})$ ⇒ B spans V.
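A small numerical sketch of this example (ours, assuming NumPy; each 2×2 matrix is flattened to a vector of ℝ4): the two matrices are linearly independent, and the coordinates of $(\begin{smallmatrix}a & a+b\\ a+b & b\end{smallmatrix})$ in this basis come out as (a, b).

```python
import numpy as np

# Flatten the two candidate basis matrices of the example into vectors of R^4.
M1 = np.array([[1, 1], [1, 0]]).flatten()   # (1, 1, 1, 0)
M2 = np.array([[0, 1], [1, 1]]).flatten()   # (0, 1, 1, 1)
A = np.column_stack([M1, M2])

print(np.linalg.matrix_rank(A))   # 2 -> the two matrices are linearly independent

# Any matrix [[a, a+b], [a+b, b]] has (unique) coordinates (a, b) in this basis.
a, b = 3.0, -5.0
target = np.array([[a, a + b], [a + b, b]]).flatten()
coords, *_ = np.linalg.lstsq(A, target, rcond=None)
print(coords)                      # approximately [ 3. -5.]
```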

Definition. Suppose V has a basis B = {v1, v2, ···, vn}, so that V = span{v1, v2, ···, vn} = {α1v1 + α2v2 + ··· + αnvn| αi ∈ F}. Let’s define the coordinate map, γB: V → Fn, γB(v) = $(\begin{smallmatrix}α_1\\ α_2\\ \vdots \\α_n\end{smallmatrix})$ if v = α1v1 + α2v2 + ··· + αnvn; this is well defined by the previous proposition.

Examples. B = {$(\begin{smallmatrix}2\\ 3\end{smallmatrix}), (\begin{smallmatrix}1\\ -1\end{smallmatrix})$} is a basis of ℝ2, and we can compute its coordinate map:

  1. Linear independence. $x(\begin{smallmatrix}2\\ 3\end{smallmatrix}) + y(\begin{smallmatrix}1\\ -1\end{smallmatrix}) = (\begin{smallmatrix}0\\ 0\end{smallmatrix})$ ⇒ 2x + y = 0, 3x -y = 0 ⇒ x = y = 0.
  2. v = $(\begin{smallmatrix}a\\ b\end{smallmatrix})$ ∈ ℝ2, $x(\begin{smallmatrix}2\\ 3\end{smallmatrix}) + y(\begin{smallmatrix}1\\ -1\end{smallmatrix}) = (\begin{smallmatrix}a\\ b\end{smallmatrix})$ ⇒ 2x + y = a, 3x -y = b ⇒ x = (a+b)/5, y = (3/5)a -(2/5)b.
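A hedged sketch of this coordinate map in Python (the helper name gamma_B and the test values are ours): computing γB(v) amounts to solving a 2×2 linear system whose columns are the basis vectors.

```python
import numpy as np

# Basis vectors of B as the columns of a 2x2 matrix.
B = np.array([[2, 1],
              [3, -1]])

def gamma_B(v):
    """Coordinates (x, y) such that x*(2, 3) + y*(1, -1) = v."""
    return np.linalg.solve(B, v)

a, b = 7.0, 2.0
print(gamma_B(np.array([a, b])))          # approximately [1.8 3.4]
print((a + b) / 5, (3 * a - 2 * b) / 5)   # 1.8 3.4 -> matches the closed form above
```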

Theorem. Suppose V = span{v1, v2, ···, vn} and {w1, w2, ···, wm} is linearly independent. Then, m ≤ n.

Proof.

w1 ∈ V = span{v1, v2, ···, vn} ⇒ w1 = α1v1 + α2v2 + ··· + αnvn, αi ∈ F, not all zero, because if all αi were zero, then w1 = 0 and {w1, w2, ···, wm} would be linearly dependent ⊥.

Rename v1, v2, ···, vn so that α1 ≠ 0 ⇒ $v_1 = -α_1^{-1}(α_2v_2 + α_3v_3 + ··· + α_nv_n - w_1)$ ∈ span{w1, v2, ···, vn} ⇒ V = span{w1, v2, ···, vn}

Let’s repeat the process: write w2 = β1w1 + β2v2 + ··· + βnvn, where β2, ···, βn are not all zero because otherwise w2 = β1w1 and {w1, w2, ···, wm} would be linearly dependent ⊥.

Rename or reorder so that β2 ≠ 0 ⇒ $v_2 = -β_2^{-1}(β_1w_1 + β_3v_3 + ··· + β_nv_n - w_2)$ ∈ span{w1, w2, v3, ···, vn} ⇒ V = span{w1, w2, v3, ···, vn}

Let’s continue this process until we have “used” all the wi; at each step there is still some vj left to exchange, so m ≤ n and V = span{w1, ···, wm, vm+1, ···, vn} ∎
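To illustrate the theorem with a concrete example of our own choosing (NumPy assumed): if V is spanned by n = 2 vectors, any 3 vectors taken from V must be linearly dependent.

```python
import numpy as np

# V = span{v1, v2} is spanned by n = 2 vectors, so any m = 3 vectors taken
# from V must be linearly dependent (otherwise m <= n would fail).
v1 = np.array([1, 0, 1])
v2 = np.array([0, 1, 1])
W = np.column_stack([v1 + v2, 2 * v1 - v2, 3 * v1 + 5 * v2])  # three vectors in V
print(np.linalg.matrix_rank(W))   # 2 < 3 -> they cannot be linearly independent
```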

Definition. If B = {v1, v2, ···, vn} is a basis for V, V is said to have dimension n.

Proposition. The notion of dimension is well defined. In other words, every basis has the same number of vectors, e.g., {1, i} is a basis for ℂ over ℝ, and the dimension is two.

Proof. Suppose B = {v1, v2, ···, vn} and B’ = {w1, w2, ···, wm} are bases for V.

B spans V and B’ is linearly independent ⇒ m ≤ n; but B’ spans V and B is linearly independent ⇒ n ≤ m. Therefore, m = n ∎

Examples

| Vector space | Basis | Dimension |
| --- | --- | --- |
| {0} | ∅ | 0 |
| $F^n$ | $e_1, e_2, ···, e_n$ | n |
| F[x] | $1, x, x^2, x^3, ···$ | +∞ |
| $\wp_n$ | $1, x, x^2, x^3, ···, x^n$ | n + 1 |
| ℂ(ℝ) | unlistable | +∞ |

Example. Let V = {$(\begin{smallmatrix}x_1\\ x_2\\x_3\end{smallmatrix})$ ∈ ℝ3 | x1 + x2 + x3 = 0}. x1 + x2 + x3 = 0 ⇒ x1 = - x2 - x3 ⇒ $v=(\begin{smallmatrix}-x_2-x_3\\ x_2\\x_3\end{smallmatrix}) = x_2(\begin{smallmatrix}-1\\ 1\\0\end{smallmatrix})+x_3(\begin{smallmatrix}-1\\ 0\\1\end{smallmatrix})$, for some x2, x3 ∈ ℝ. Therefore, V = span{$(\begin{smallmatrix}-1\\ 1\\0\end{smallmatrix}), (\begin{smallmatrix}-1\\ 0\\1\end{smallmatrix})$}

We only need to prove that these two vectors are linearly independent. Suppose α, β ∈ ℝ,

$α(\begin{smallmatrix}-1\\ 1\\ 0\end{smallmatrix})+β(\begin{smallmatrix}-1\\ 0\\ 1\end{smallmatrix}) = (\begin{smallmatrix}0\\ 0\\ 0\end{smallmatrix})$ ⇒ -α-β=0, α=0, β=0, ∎. Therefore, we have a basis and dim(V) = 2.
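The same basis can be found computationally. A sketch using SymPy (our own illustration): V is the null space of the 1×3 matrix (1 1 1), and its nullspace() method returns two spanning vectors, so dim(V) = 2.

```python
from sympy import Matrix

# V = {x in R^3 : x1 + x2 + x3 = 0} is the null space of the 1x3 matrix (1 1 1).
A = Matrix([[1, 1, 1]])
basis = A.nullspace()
for v in basis:
    print(v.T)          # with this matrix: (-1, 1, 0) and (-1, 0, 1)
print(len(basis))       # 2 -> dim(V) = 2, matching the computation above
```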

Let f(x) be an arbitrary element of {f(x) ∈ ℝ[x] | f(3) = 0 and deg(f(x)) ≤ 2} ⊆ $\wp_2$. f(3) = 0 ⇒ [by the factor theorem] f(x) = (x-3)g(x) where g(x) is a polynomial of degree at most 1.

  1. deg(f(x)) = 0 ⇒ f(x) = c for some constant c ≠ 0 ⇒ f(3) = c ≠ 0 ⊥
  2. deg(f(x)) = 1 ⇒ deg(g(x)) = 0 ⇒ f(x) = c(x-3)
  3. deg(f(x)) = 2 ⇒ deg(g(x)) = 1 ⇒ f(x) = (ax + b)(x - 3). This last form is the most general case of f(x); a = 0 recovers case 2.

f(x) = (ax + b)(x - 3) = ax(x - 3) + b(x - 3) ∈ span{x(x-3), x-3}

{f(x) ∈ ℝ[x] | f(3) = 0 and deg(f(x)) ≤ 2} = span{x(x-3), x-3}, so we only need to show that these two polynomials are linearly independent.

Suppose ax(x - 3) + b(x - 3) = 0 for all x ∈ ℝ. Setting x = 0 ⇒ -3b = 0 ⇒ b = 0.

Then ax(x - 3) = 0 ∀x ∈ ℝ. In particular, at x = 1, -2a = 0 ⇒ a = 0. Therefore, {x(x-3), x-3} is a basis, and the dimension is equal to two.
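A short symbolic check of this example (our own sketch using SymPy; the names p1 and p2 are ours): the candidate basis {x(x-3), x-3} is linearly independent, and every f(x) = (ax + b)(x - 3) lies in its span.

```python
from sympy import symbols, Poly, solve

x, a, b = symbols('x a b')
p1 = x * (x - 3)        # candidate basis element x(x - 3)
p2 = x - 3              # candidate basis element x - 3

# Linear independence: a*p1 + b*p2 = 0 as a polynomial forces a = b = 0.
combo = Poly(a * p1 + b * p2, x)
print(solve(combo.coeffs(), [a, b]))      # a = 0 and b = 0

# Spanning: every f = (a*x + b)*(x - 3) satisfies f(3) = 0 and equals a*p1 + b*p2.
f = (a * x + b) * (x - 3)
print(f.subs(x, 3))                        # 0
print((f - (a * p1 + b * p2)).expand())    # 0
```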

Bibliography

This content is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. This post relies heavily on the following resources, especially on NPTEL-NOC IITM, Introduction to Galois Theory, Michael Penn, and Contemporary Abstract Algebra, Joseph A. Gallian.
  1. NPTEL-NOC IITM, Introduction to Galois Theory.
  2. Algebra, Second Edition, by Michael Artin.
  3. LibreTexts, Abstract and Geometric Algebra, Abstract Algebra: Theory and Applications (Judson).
  4. Field and Galois Theory, by Patrick Morandi. Springer.
  5. Michael Penn (Abstract Algebra), and MathMajor.
  6. Contemporary Abstract Algebra, Joseph A. Gallian.
  7. Andrew Misseldine: College Algebra and Abstract Algebra.

