Chapter 3: Vector Spaces | MTS203 Linear Algebra
Leon 8th Ed.


Definition & examples · Subspaces · Linear independence · Basis & dimension · Change of basis · Row & column space

The operations of addition and scalar multiplication appear in $\mathbb{R}^n$, in polynomials, in matrices, in continuous functions. A vector space unifies all of these with 8 axioms. Every theorem you prove for an abstract vector space holds simultaneously in all of these settings.

Section 3.1

Definition & Examples

Definition — Vector Space
A set $V$ with operations of addition and scalar multiplication, closed under both, is a vector space if all 8 axioms hold:
  1. Commutativity: $\mathbf{u}+\mathbf{v}=\mathbf{v}+\mathbf{u}$
  2. Associativity: $(\mathbf{u}+\mathbf{v})+\mathbf{w}=\mathbf{u}+(\mathbf{v}+\mathbf{w})$
  3. Zero vector: $\mathbf{v}+\mathbf{0}=\mathbf{v}$
  4. Additive inverse: $\mathbf{v}+(-\mathbf{v})=\mathbf{0}$
  5. Distributivity over vector addition: $\alpha(\mathbf{u}+\mathbf{v})=\alpha\mathbf{u}+\alpha\mathbf{v}$
  6. Distributivity over scalar addition: $(\alpha+\beta)\mathbf{v}=\alpha\mathbf{v}+\beta\mathbf{v}$
  7. Compatibility of scalar multiplication: $(\alpha\beta)\mathbf{v}=\alpha(\beta\mathbf{v})$
  8. Scalar identity: $1\cdot\mathbf{v}=\mathbf{v}$
| Vector Space | Elements | $\dim$ | Zero vector |
| --- | --- | --- | --- |
| $\mathbb{R}^n$ | $n$-tuples of reals | $n$ | $(0,\ldots,0)$ |
| $\mathbb{R}^{m\times n}$ | $m\times n$ matrices | $mn$ | Zero matrix $O$ |
| $P_n$ | Polynomials of degree $< n$ | $n$ | $p(x)\equiv 0$ |
| $C[a,b]$ | Continuous functions on $[a,b]$ | $\infty$ | $f(x)\equiv 0$ |
| $\{\mathbf{0}\}$ | Just the zero vector | $0$ | $\mathbf{0}$ |
Figure: vector addition in $\mathbb{R}^2$ by the parallelogram law. The sum $\mathbf{u}+\mathbf{v}$ is the diagonal of the parallelogram with sides $\mathbf{u}$ and $\mathbf{v}$.
Section 3.2

Subspaces

Definition — Subspace
A nonempty $S \subseteq V$ is a subspace iff it is closed under both operations:
  • $\alpha\mathbf{x} \in S$ whenever $\mathbf{x} \in S$ (closed under scalar multiplication)
  • $\mathbf{x}+\mathbf{y} \in S$ whenever $\mathbf{x},\mathbf{y} \in S$ (closed under addition)
Equivalent one-line test: $\alpha\mathbf{x}+\beta\mathbf{y} \in S$ for all $\mathbf{x},\mathbf{y}\in S$ and all scalars $\alpha,\beta$.
Quick 3-Step Subspace Test
  1. Does $S$ contain $\mathbf{0}$? (If not — stop, it's NOT a subspace.)
  2. Closed under scalar multiplication? (Take arbitrary $\mathbf{x}\in S$, check $\alpha\mathbf{x}\in S$.)
  3. Closed under addition? (Take arbitrary $\mathbf{x},\mathbf{y}\in S$, check $\mathbf{x}+\mathbf{y}\in S$.)
📘 Example 3.1 — Subspace (Matrices with $a_{12}=-a_{21}$)
Let $S = \{A \in \mathbb{R}^{2\times2} \mid a_{12} = -a_{21}\}$.

Zero: $O \in S$ since $0 = -0$. ✓

Scalar mult: If $A \in S$, then $(\alpha A)_{12} = \alpha a_{12} = -\alpha a_{21} = -(\alpha A)_{21}$. ✓

Addition: If $A,B \in S$, then $(A+B)_{12} = a_{12}+b_{12} = -a_{21}-b_{21} = -(A+B)_{21}$. ✓

$S$ is a subspace. (It is 3-dimensional: $a_{11}$, $a_{22}$, and $a_{12}$ are free, while $a_{21}=-a_{12}$ is determined.)
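The 3-step test from Example 3.1 can be spot-checked numerically. The sketch below (my own illustration, using NumPy, with a hypothetical helper `in_S`) verifies the zero matrix and closure under both operations on random elements of $S$:

```python
import numpy as np

def in_S(A, tol=1e-12):
    """Membership test for S = {A in R^{2x2} : a12 = -a21}."""
    return abs(A[0, 1] + A[1, 0]) < tol

rng = np.random.default_rng(0)

def random_S():
    """Random element of S: a11, a22, a12 free; a21 = -a12."""
    a11, a22, a12 = rng.standard_normal(3)
    return np.array([[a11, a12], [-a12, a22]])

# Step 1: the zero matrix belongs to S
assert in_S(np.zeros((2, 2)))
# Steps 2-3: closure under scalar multiplication and addition
for _ in range(100):
    A, B = random_S(), random_S()
    alpha = rng.standard_normal()
    assert in_S(alpha * A) and in_S(A + B)
```

Random spot checks can only refute closure, never prove it; the algebraic argument in the example is what establishes the result.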
📘 Example 3.2 — NOT a Subspace (Fails Zero Test)
$S = \{(x_1, x_2)^T \mid x_1 + x_2 = 1\}$.

The zero vector $(0,0)^T$ has $0+0=0 \neq 1$, so $\mathbf{0} \notin S$.

Not a subspace. (It's a line not passing through the origin — translated, not a true subspace.)

Fundamental Subspaces of a Matrix

| Subspace | Symbol | Definition | Lives in |
| --- | --- | --- | --- |
| Null space | $N(A)$ | $\{\mathbf{x} \mid A\mathbf{x}=\mathbf{0}\}$ | $\mathbb{R}^n$ |
| Column space | $R(A)$ | $\{A\mathbf{x} \mid \mathbf{x}\in\mathbb{R}^n\}$ | $\mathbb{R}^m$ |
| Row space | $R(A^T)$ | Span of the rows of $A$ | $\mathbb{R}^n$ |
| Left null space | $N(A^T)$ | $\{\mathbf{y} \mid A^T\mathbf{y}=\mathbf{0}\}$ | $\mathbb{R}^m$ |
Section 3.3

Linear Independence

Definition
Vectors $\mathbf{v}_1, \ldots, \mathbf{v}_n \in V$ are linearly independent if $$c_1\mathbf{v}_1 + \cdots + c_n\mathbf{v}_n = \mathbf{0} \implies c_1 = \cdots = c_n = 0.$$ They are linearly dependent if some nontrivial combination equals $\mathbf{0}$ — meaning at least one vector is redundant (a linear combination of the others).
Figure: left, two independent vectors (neither a multiple of the other; together they span a plane); right, three dependent vectors (the third lies in the span of the first two).
Test for Linear Independence
Form the matrix $M = [\mathbf{v}_1 \; \mathbf{v}_2 \; \cdots \; \mathbf{v}_n]$ and row reduce. The vectors are linearly independent iff there are no free variables — every column is a pivot column.
📘 Example 3.3 — Testing Linear Independence
$\mathbf{v}_1=(1,0,2)^T$, $\mathbf{v}_2=(0,1,1)^T$, $\mathbf{v}_3=(2,1,5)^T$. $$\begin{pmatrix}1&0&2\\0&1&1\\2&1&5\end{pmatrix} \xrightarrow{R_3-2R_1-R_2} \begin{pmatrix}1&0&2\\0&1&1\\0&0&0\end{pmatrix}$$ Column 3 has no pivot, so a free variable exists. Dependency: $\mathbf{v}_3 = 2\mathbf{v}_1 + \mathbf{v}_2$. Linearly dependent. $\{\mathbf{v}_1, \mathbf{v}_2\}$ alone is an independent set spanning the same subspace.
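The row-reduction test of Example 3.3 can be reproduced in exact arithmetic; a sketch using SymPy (my choice of tool, not the text's):

```python
from sympy import Matrix

# Columns of M are v1, v2, v3 from Example 3.3
M = Matrix([[1, 0, 2],
            [0, 1, 1],
            [2, 1, 5]])

rref, pivots = M.rref()        # pivots == (0, 1): column 3 has no pivot
assert M.rank() < M.cols       # fewer pivots than columns => dependent

# Recover the dependency by solving M c = 0:
c = M.nullspace()[0]           # a multiple of (-2, -1, 1)^T, i.e. v3 = 2 v1 + v2
assert M * c == Matrix([0, 0, 0])
```

A nullspace vector $(-2,-1,1)^T$ encodes the relation $-2\mathbf{v}_1 - \mathbf{v}_2 + \mathbf{v}_3 = \mathbf{0}$, matching the dependency found by hand.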
Section 3.4

Basis & Dimension

Definition — Basis and Dimension
$\{\mathbf{v}_1,\ldots,\mathbf{v}_n\}$ is a basis for $V$ iff:
  • The vectors are linearly independent, AND
  • They span $V$ (every element of $V$ = linear combination of them)
A basis is a minimal spanning set — remove any vector and it no longer spans; add any vector and it's no longer independent. All bases of $V$ have the same number of vectors: $\dim(V)$.
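For $n$ vectors in $\mathbb{R}^n$, the two basis conditions collapse into a single check: the matrix with those vectors as columns must be nonsingular. A sketch (my own illustration, using SymPy):

```python
from sympy import Matrix

# n vectors in R^n form a basis exactly when the matrix having them
# as columns is nonsingular (rank n, equivalently nonzero determinant).
B = Matrix([[1, 0, 2],
            [0, 1, 1],
            [0, 0, 1]])              # columns: a candidate basis of R^3
assert B.rank() == 3 and B.det() != 0  # independent AND spanning: a basis

C = Matrix([[1, 0, 2],
            [0, 1, 1],
            [2, 1, 5]])              # columns: the vectors of Example 3.3
assert C.det() == 0                  # dependent, so not a basis
```

This shortcut only applies when the number of vectors equals the dimension; with fewer vectors they can at best be independent, with more they must be dependent.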

Rank-Nullity Theorem

Rank-Nullity Theorem (Theorem 3.4.x)
For any $m\times n$ matrix $A$: $$\underbrace{\dim(R(A))}_{\text{rank}(A)} + \underbrace{\dim(N(A))}_{\text{nullity}(A)} = n$$ Pivot columns + free columns = total columns. Always.
📘 Example 3.4 — Basis for Null Space
$$A = \begin{pmatrix}1&1&1&0\\2&1&0&1\end{pmatrix}$$ RREF: $\begin{pmatrix}1&0&-1&1\\0&1&2&-1\end{pmatrix}$. Free variables: $x_3=\alpha$, $x_4=\beta$. $$\mathbf{x} = \alpha\begin{pmatrix}1\\-2\\1\\0\end{pmatrix} + \beta\begin{pmatrix}-1\\1\\0\\1\end{pmatrix}$$ Basis for $N(A)$: $\{(1,-2,1,0)^T, (-1,1,0,1)^T\}$. Check: rank$(A)=2$, nullity$(A)=2$, $2+2=4=n$. ✓
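Example 3.4 and the rank-nullity check can be confirmed in a few lines; a sketch using SymPy (tool choice is mine):

```python
from sympy import Matrix

A = Matrix([[1, 1, 1, 0],
            [2, 1, 0, 1]])

ns = A.nullspace()                   # basis for N(A), as column vectors
assert len(ns) == 2                  # nullity = 2
assert A.rank() + len(ns) == A.cols  # rank-nullity: 2 + 2 = 4 = n
for v in ns:
    assert A * v == Matrix([0, 0])   # each basis vector really solves Ax = 0
```

SymPy sets each free variable to 1 in turn, so the basis it returns matches the hand computation $\alpha(1,-2,1,0)^T + \beta(-1,1,0,1)^T$.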
Section 3.5

Change of Basis

Every vector has coordinates depending on which basis you use. The transition matrix $S$ converts coordinates from one basis to another — critical for understanding similarity in Chapter 4 and diagonalization in Chapter 6.

Transition Matrix
If $E = \{\mathbf{v}_1,\ldots,\mathbf{v}_n\}$ and $F = \{\mathbf{w}_1,\ldots,\mathbf{w}_n\}$ are ordered bases, the transition matrix $S$ (from $E$ to $F$) has $j$-th column $= [\mathbf{v}_j]_F$. Then: $$[\mathbf{x}]_F = S[\mathbf{x}]_E$$ (equivalently, $[\mathbf{x}]_E = S^{-1}[\mathbf{x}]_F$).
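A concrete sketch of this conversion, with two hypothetical ordered bases of $\mathbb{R}^2$ of my own choosing (columns of `V` and `W` below):

```python
import numpy as np

# Hypothetical ordered bases E = {v1, v2} and F = {w1, w2} of R^2,
# stored as the COLUMNS of V and W.
V = np.array([[1.0, 1.0],
              [0.0, 1.0]])          # v1 = (1,0)^T, v2 = (1,1)^T
W = np.array([[2.0, 1.0],
              [0.0, 1.0]])          # w1 = (2,0)^T, w2 = (1,1)^T

# Column j of the transition matrix S is [v_j]_F, i.e. S solves W S = V.
S = np.linalg.solve(W, V)

# Check on a sample coordinate vector: if [x]_E = c, then [x]_F = S c.
c = np.array([3.0, -2.0])           # [x]_E
x = V @ c                           # the actual vector x
assert np.allclose(W @ (S @ c), x)  # reassembling x from S c recovers it
```

Solving $WS = V$ column by column is exactly "express each $\mathbf{v}_j$ in $F$-coordinates"; inverting $W$ explicitly is never needed.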
Section 3.6

Row Space & Column Space

| Subspace | How to find a basis | Key note |
| --- | --- | --- |
| Row space of $A$ | Nonzero rows of any REF of $A$ | Row ops preserve the row space |
| Column space of $A$ | Pivot columns of the original $A$ | Do NOT use the RREF's columns |
| Null space of $A$ | Parameterize free variables in RREF | $\dim = n - \text{rank}$ |
Row Space vs Column Space
$\text{rank}(A) = \dim(\text{row space}) = \dim(\text{column space})$ — dimensions agree. But the spaces live in different $\mathbb{R}^k$'s. Also: $\text{rank}(A) = \text{rank}(A^T)$ — the row rank always equals the column rank.
📘 Example 3.5 — All Four Fundamental Subspaces
$$A = \begin{pmatrix}1&2&3\\2&4&6\\1&2&4\end{pmatrix}, \quad \text{RREF} = \begin{pmatrix}1&2&0\\0&0&1\\0&0&0\end{pmatrix}$$
  • Column space basis: pivot columns 1 and 3 of original $A$: $\{(1,2,1)^T, (3,6,4)^T\}$
  • Row space basis: nonzero rows of RREF: $\{(1,2,0), (0,0,1)\}$
  • Null space: $x_2=t$ free, $x_1=-2t$, $x_3=0$ → basis $\{(-2,1,0)^T\}$
  • Left null space: null space of $A^T$ → basis $\{(2,-1,0)^T\}$ (dimension $m - \text{rank} = 3 - 2 = 1$)
rank = 2, nullity = 1, $2+1=3=n$. ✓
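The dimensions of all four fundamental subspaces from Example 3.5 can be verified at once; a sketch using SymPy (my choice of tool):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 2, 4]])

r = A.rank()
assert r == 2
assert len(A.columnspace()) == r            # pivot columns of the original A
assert len(A.nullspace()) == A.cols - r     # nullity = 3 - 2 = 1
assert len(A.T.nullspace()) == A.rows - r   # left nullity = 3 - 2 = 1
assert A.T.rank() == r                      # row rank equals column rank
```

Note that `columnspace()` returns pivot columns of the original matrix, matching the warning in the table above, while the row space basis comes from the RREF's nonzero rows.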