Thursday, January 17, 2013 CC-BY-NC
Dimension

Maintainer: admin

Material covered: the definition of the dimension of a vector space, the lunch in Chinatown identity¹.

1 Proposition 2.13

Suppose $V$ is a finite-dimensional vector space, of which $U$ is a subspace. Then, there exists a subspace $W \subset V$ such that $V = U \oplus W$.

Proof: Let $B = (u_1, \ldots, u_n)$ be a basis for $U$. Since each of these vectors is in $V$, and they are linearly independent in $U$, they are also linearly independent in $V$; thus, by theorem 2.12, we can extend $B$ to a basis $(u_1, \ldots, u_n, w_1, \ldots, w_m)$ of $V$. Then, $(w_1, \ldots, w_m)$ is a list of vectors whose span is a subspace of $V$, which we will call $W$.

Now, we need to show that $U \oplus W = V$. First, we need to show that $U + W = V$, which we can do by proving that $V \subseteq U + W$ and that $U + W \subseteq V$. For the former, let $v \in V$, which we can write as a linear combination of basis vectors in $B$:

$$v = \underbrace{a_1 u_1 + \ldots + a_nu_n}_{\in U} + \underbrace{b_1w_1 + \ldots + b_mw_m}_{\in W} = u + w, \; u \in U, w \in W$$

So this tells us that any vector in $V$ can be expressed as the sum of a vector in $U$ and a vector in $W$, and so $V \subseteq U + W$. The other direction is trivial: if $u \in U$, then $u = a_1u_1 + \ldots + a_nu_n = a_1u_1 + \ldots + a_nu_n + 0w_1 + \ldots + 0w_m \in V$, and if $w \in W$, then $w = b_1w_1 + \ldots + b_mw_m = 0u_1 + \ldots + 0u_n + b_1w_1 + \ldots + b_mw_m \in V$; since $V$ is closed under addition, any $u + w$ is in $V$ as well. So we've shown that $V = U + W$.

To conclude the proof, we need to show that $U \cap W = \{0\}$. Let $x \in U \cap W$ be some vector in their intersection. We will proceed to show that $x$ must be the zero vector. Since $x \in U$, we can write it as a linear combination of the basis vectors in $U$: $x = a_1u_1 + \ldots + a_nu_n$. Similarly, since $x \in W$, we can write it as a linear combination of the basis vectors in $W$: $x = b_1w_1 + \ldots + b_mw_m$. We can then combine these two representations by subtracting one from the other, as follows:

$$0 = x - x = (a_1u_1 + \ldots + a_nu_n) - (b_1w_1 + \ldots + b_mw_m) = a_1u_1 + \ldots + a_nu_n + c_1w_1 + \ldots + c_mw_m$$

(where $c_i = -b_i$ is just some coefficient in the field). However, since the extended basis $(u_1, \ldots, u_n, w_1, \ldots, w_m)$ is linearly independent in $V$, the coefficients above must all be zero. So we have that $x = 0$, which tells us that the only vector in the intersection of $U$ and $W$ is the zero vector. This concludes the proof that $V = U \oplus W$. $\blacksquare$
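
As a quick sanity check (my own example, not from the lecture): take $V = \mathbb R^3$ and let $U$ be the $xy$-plane, with basis $B = (e_1, e_2)$. Extending $B$ by $e_3$ gives a basis of $\mathbb R^3$, so we can take $W = \operatorname{span}(e_3)$, and every vector splits as

$$(x, y, z) = \underbrace{(x, y, 0)}_{\in U} + \underbrace{(0, 0, z)}_{\in W}, \qquad U \cap W = \{0\}$$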

2 Theorem 2.14

Any two bases of a finite-dimensional vector space $V$ are the same size (i.e., contain the same number of vectors).

Proof: The proof is quite simple and relies heavily on theorem 2.6 (the length of any linearly independent list is at most the length of any spanning list). Let $B_1$ and $B_2$ be two bases of $V$, where $B_1$ contains $m$ vectors and $B_2$ contains $n$ vectors. Since $B_1$ spans $V$, and since $B_2$ is a linearly independent set, we know that $m \geq n$. Similarly, since $B_2$ spans $V$, and since $B_1$ is a linearly independent set, we know that $n \geq m$. So $m = n$, and thus any two bases of a vector space $V$ contain the same number of vectors. $\blacksquare$
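
For a concrete instance (my own, not from the lecture): $((1, 0), (0, 1))$ and $((1, 1), (1, -1))$ are both bases of $\mathbb R^2$, and, as the theorem guarantees, both have length 2. Both span because

$$(x, y) = x(1, 0) + y(0, 1) = \frac{x+y}{2}(1, 1) + \frac{x-y}{2}(1, -1)$$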

3 Dimension

The dimension of a vector space $V$ is the number of vectors in any basis of $V$. This is well-defined since, as we saw in the previous section, any two bases in a vector space contain the same number of vectors.

Examples:

  • $\dim(\mathbb F^n) = n$ (where $\mathbb F^n$ is a vector space over $\mathbb F$)
  • $\dim(\mathbb C) = 1$ over $\mathbb C$ (when considered over $\mathbb R$, its dimension would be 2)
  • $\dim(P_n(\mathbb F)) = n+1$ (because of the constant term for polynomials)
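
To make the last example concrete (my own verification): the monomials form the standard basis of $P_n(\mathbb F)$, and the list starts at $x^0 = 1$, which is where the extra $+1$ comes from:

$$P_n(\mathbb F) = \operatorname{span}(\underbrace{1, x, x^2, \ldots, x^n}_{n+1 \text{ vectors}}), \qquad \dim(P_n(\mathbb F)) = n + 1$$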

3.1 Propositions 2.15-2.17

Let $V$ be a finite-dimensional vector space.

  1. Let $U$ be a subspace of $V$. Then, $\dim(U) \leq \dim(V)$. (Reason: any basis of $U$ is linearly independent in the ambient space, $V$, as well; so, theorem 2.6 applies.)
  2. Every spanning list for $V$ with length $\dim(V)$ is a basis for $V$.
  3. Every linearly independent list with length $\dim(V)$ is also a basis for $V$ (see the example below).

Proofs: left as an exercise.
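
As an illustration of the third item (my own example): the list $((1, 2), (3, 4))$ in $\mathbb R^2$ is linearly independent, since

$$a(1, 2) + b(3, 4) = (0, 0) \implies \begin{cases} a + 3b = 0 \\ 2a + 4b = 0 \end{cases} \implies a = b = 0$$

and it has length $2 = \dim(\mathbb R^2)$, so it is automatically a basis; no separate spanning check is needed.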

3.2 Theorem 2.18: Lunch in Chinatown

If $U_1$ and $U_2$ are subspaces of a finite-dimensional vector space, then:

$$\dim(U_1 + U_2) = \dim(U_1) + \dim(U_2) - \dim(U_1 \cap U_2)$$

This is a fairly important identity relating the dimensions of the sum to the sum of the dimensions. Make sure you know how to prove this.
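
Here is a quick instance of the identity (my own, not from the lecture): in $\mathbb R^3$, let $U_1$ be the $xy$-plane and $U_2$ be the $xz$-plane. Then $U_1 \cap U_2$ is the $x$-axis and $U_1 + U_2 = \mathbb R^3$, so both sides agree:

$$\dim(U_1 + U_2) = 3 = 2 + 2 - 1 = \dim(U_1) + \dim(U_2) - \dim(U_1 \cap U_2)$$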

Proof: $(U_1 \cap U_2)$ is a finite-dimensional subspace of both $U_1$ and $U_2$. Let $B_1 = (u_1, \ldots, u_m)$ be a basis for this intersection. Since $B_1$ is linearly independent in both $U_1$ and $U_2$, we are able to extend it to a basis of $U_1$ and to a basis of $U_2$. We propose the following basis for $U_1$: $(u_1, \ldots, u_m, v_1, \ldots, v_j)$. So $\dim(U_1) = m + j$. For $U_2$, our proposed basis is: $(u_1, \ldots, u_m, w_1, \ldots, w_k)$, so $\dim(U_2) = m+k$. So the right-hand side of the equation above simplifies to $\dim(U_1) + \dim(U_2) - \dim(U_1 \cap U_2) = (m+j) + (m+k) - m = m + j + k$.

Now we need to come up with a suitable basis for $U_1 + U_2$. An obvious choice would be $B_2 = (u_1, \ldots, u_m, v_1, \ldots, v_j, w_1, \ldots, w_k)$. Let's prove that this is indeed a basis for $U_1 + U_2$. To do this, we will need to first show that it is a spanning set, and then that it is linearly independent.

Proof of spanning-set-ness: Let $x + y \in U_1 + U_2$ (where $x \in U_1$ and $y \in U_2$). We can write $x$ as a linear combination of basis vectors of $U_1$: $x = a_1u_1 + \ldots + a_mu_m + b_1v_1 + \ldots + b_jv_j$. Similarly, we can write $y$ as $y = c_1u_1 + \ldots + c_mu_m + d_1w_1 + \ldots + d_kw_k$. Then:

$$\begin{align}x + y & = (a_1u_1 + \ldots + a_mu_m + b_1v_1 + \ldots + b_jv_j) + (c_1u_1 + \ldots + c_mu_m + d_1w_1 + \ldots + d_kw_k) \\ & = (a_1 + c_1)u_1 + \ldots + (a_m + c_m)u_m + b_1v_1 + \ldots + b_jv_j + d_1w_1 + \ldots + d_kw_k \end{align}$$

which tells us that any vector in $U_1 + U_2$ can be written as a combination of basis vectors in $B_2$. Thus $U_1 + U_2$ is spanned by $B_2$.

Now we just need to show that $B_2$ is linearly independent. To do this, we prove that if some linear combination of the vectors in $B_2$ results in the zero vector, then all coefficients must have been zero. The equation for the linear combination is:

$$a_1u_1 + \ldots + a_mu_m + b_1v_1 + \ldots + b_jv_j + c_1w_1 + \ldots + c_kw_k = 0$$

We can rearrange this equation by moving the $u$ and $v$ terms to the right, leaving:

$$c_1w_1 + \ldots + c_kw_k = -a_1u_1 - \ldots - a_mu_m - b_1v_1 - \ldots - b_jv_j$$

The vector on the left-hand side is clearly an element of $U_2$, since it's a linear combination of some of the basis vectors of $U_2$. It's also an element of $U_1$, since it's equal to the right-hand side (which is a linear combination of basis vectors of $U_1$). Thus $c_1w_1 + \ldots + c_kw_k \in U_1 \cap U_2$, and so we can write it as a linear combination of the vectors in $B_1$:

$$c_1w_1 + \ldots + c_kw_k = d_1u_1 + \ldots + d_mu_m$$

If we then subtract the two different representations for $c_1w_1 + \ldots + c_kw_k$ (which we'll now call $x$, for simplicity), we get the zero vector:

$$\begin{align}0 & = x - x \\ & = (d_1u_1 + \ldots + d_mu_m) - (-a_1u_1 - \ldots - a_mu_m - b_1v_1 - \ldots - b_jv_j) \\ & = (d_1 + a_1)u_1 + \ldots + (d_m + a_m)u_m + b_1v_1 + \ldots + b_jv_j\end{align}$$

But we know that $(u_1, \ldots, u_m, v_1, \ldots, v_j)$ is a basis for $U_1$, which means that the vectors must be linearly independent. So $b_i = 0$ for all $i$ (and $d_i = -a_i$ for all $i$). If we substitute $b_i = 0$ into the original equation for the linear combination, we get:

$$a_1u_1 + \ldots + a_mu_m + c_1w_1 + \ldots + c_kw_k = 0$$

and since $(u_1, \ldots, u_m, w_1, \ldots, w_k)$ is a basis for $U_2$, these vectors are linearly independent in $U_2$ (and thus in $V$), so $a_i = 0$ and $c_i = 0$ for all $i$. Together with $b_i = 0$ from before, all the coefficients are zero, which completes the proof that $B_2$ is linearly independent, and so we can conclude that $B_2$ is a basis for $U_1 + U_2$. Since there are $m + j + k$ vectors in $B_2$, we get $\dim(U_1 + U_2) = m + j + k$, which matches the right-hand side computed earlier. $\blacksquare$

Note that there is no equivalent of lunch in Chinatown for $> 2$ subspaces: the obvious inclusion-exclusion analogue of the formula fails in general.
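
For instance (a standard counterexample, not covered in the lecture): let $U_1$, $U_2$, $U_3$ be three distinct lines through the origin in $\mathbb R^2$. All pairwise intersections (and the triple intersection) are $\{0\}$, so the inclusion-exclusion analogue would predict

$$\dim(U_1 + U_2 + U_3) = 1 + 1 + 1 - 0 - 0 - 0 + 0 = 3$$

but $U_1 + U_2 + U_3 = \mathbb R^2$, which has dimension 2.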

  1. As Loveys would call it. :(