Monday, January 14, 2013 CC-BY-NC
Span and linear independence

Maintainer: admin

Span, linear combinations, and linear independence/dependence.

1 Span

A linear combination of vectors $(v_1, \ldots, v_n)$, where each $v_i \in V$, is a vector of the form $v = \alpha_1 v_1 + \ldots + \alpha_n v_n$, where $\alpha_i \in \mathbb F$.

The span of $(v_1, \ldots, v_n)$, written $\text{span}(v_1, \ldots, v_n)$ or equivalently as $\langle v_1, \ldots, v_n\rangle$, is the set of all possible linear combinations of $(v_1, \ldots, v_n)$. We could also write it as

$$\text{span}(v_1, \ldots, v_n) = \{\alpha_1 v_1 + \ldots + \alpha_n v_n \mid \alpha_i \in \mathbb F \}$$

Clearly, each vector $v_i$ is in the span, and the span is a subspace of $V$. If the span of $(v_1, \ldots, v_n)$ is all of $V$, we say that $(v_1, \ldots, v_n)$ spans $V$. For example, $((1, 0, 0), (0, 1, 0), (0, 0, 1))$ spans $\mathbb F^3$.
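
As an illustrative aside (not from the lecture), here is a minimal Python sketch, assuming numpy is available, that recovers the coefficients $\alpha_i$ expressing an arbitrary vector of $\mathbb R^3$ in terms of that spanning list by solving a linear system; the particular vector $v$ below is just a made-up example.

```python
import numpy as np

# The spanning list ((1,0,0), (0,1,0), (0,0,1)) as the columns of a matrix.
V = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

v = np.array([3.0, -1.0, 2.0])  # an arbitrary vector of R^3

# Solve V @ alpha = v for the coefficients alpha_i of the linear combination.
alpha = np.linalg.solve(V, v)
print(alpha)                      # [ 3. -1.  2.]
print(np.allclose(V @ alpha, v))  # True: v is in the span
```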

1.1 Finite and infinite dimensions

If a vector space $V$ is spanned by a finite list of vectors, then we say that $V$ has finite dimension. If it cannot be spanned by any finite list of vectors, then we say that $V$ has infinite dimension.

Examples:

  • $\mathbb F^n$, spanned by $(1, 0, \ldots, 0)$, $(0, 1, \ldots, 0)$, $\ldots$, $(0, \ldots, 0, 1)$
  • $\mathbb P_n(\mathbb F)$ (polynomials of degree at most $n$), spanned by $(1, z, z^2, \ldots, z^n)$
  • $\mathbb P(\mathbb F)$, polynomials with no upper limit on the degree. To show that this vector space is infinite-dimensional, we use a proof by contradiction: assume it is spanned by a finite list of polynomials, and let $n$ be the highest degree among them. Then no linear combination of them can produce a polynomial of degree $n+1$, and yet such a polynomial is a valid element of the vector space, a contradiction. So no finite list of polynomials can span it (the key step is written out after this list).
  • $\mathbb F^{\infty}$, similar to above.
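
To spell out the key step of the degree argument above (a sketch; $p_1, \ldots, p_m$ is just a name for the assumed finite spanning list):

$$\deg(\alpha_1 p_1 + \ldots + \alpha_m p_m) \leq \max_i \deg p_i = n \quad \Longrightarrow \quad z^{n+1} \notin \text{span}(p_1, \ldots, p_m)$$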

2 Linear independence/dependence

A list of vectors $(v_1, \ldots, v_n)$ is said to be linearly independent if the only scalars $a_1, \ldots, a_n \in \mathbb F$ satisfying $a_1v_1 + \ldots + a_nv_n = 0$ are $a_1 = \ldots = a_n = 0$. Otherwise, the list is said to be linearly dependent.

For example, given $((1, 1, 0), (1, 0, 1), (1, 2, -1))$, if $a_1 = 2$, $a_2 = -1$ and $a_3 = -1$, then the linear combination is 0 even though not all the coefficients are 0, so these vectors are linearly dependent.
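
A quick numerical sanity check of this relation (just an illustration, assuming numpy is available):

```python
import numpy as np

v1, v2, v3 = np.array([1, 1, 0]), np.array([1, 0, 1]), np.array([1, 2, -1])

# The coefficients from the example: 2*v1 - v2 - v3 should be the zero vector.
print(2 * v1 - v2 - v3)  # [0 0 0]

# Equivalently, the matrix with these vectors as rows has rank < 3.
print(np.linalg.matrix_rank(np.array([v1, v2, v3])))  # 2
```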

On the other hand, $(z+1, z, z^2)$ is linearly independent. Proof: $a_1(z+1) + a_2(z) + a_3(z^2) = a_1z + a_1 + a_2z + a_3z^2$ (by distributivity) $= a_1 + (a_1 + a_2)z + a_3z^2$. For this to be the zero polynomial, every coefficient must vanish, so $a_1 = 0$, $a_1 + a_2 = 0$, and $a_3 = 0$. Since $a_1 + a_2 = 0$ and $a_1 = 0$, we get $a_2 = 0$. So $a_1 = a_2 = a_3 = 0$. $\blacksquare$
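
The same conclusion can be checked numerically by recording each polynomial by its coefficients with respect to $(1, z, z^2)$ and verifying that the resulting matrix has full rank (an illustrative sketch, assuming numpy):

```python
import numpy as np

# Coefficients of z+1, z, z^2 with respect to (1, z, z^2), one polynomial per row.
coeffs = np.array([[1, 1, 0],   # z + 1
                   [0, 1, 0],   # z
                   [0, 0, 1]])  # z^2

# Rank 3 (full rank) means the only solution of
# a1*(z+1) + a2*z + a3*z^2 = 0 is a1 = a2 = a3 = 0.
print(np.linalg.matrix_rank(coeffs))  # 3
```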

2.1 An alternative definition

2.1.1 Lemma 2.4

A list of vectors $(v_1, \ldots, v_n)$ is linearly dependent if and only if one vector can be written as a linear combination of the others.

Proof: ($\Rightarrow$) Assume that $(v_1, \ldots, v_n)$ is linearly dependent. That means that we can write $a_1v_1 + \ldots + a_nv_n = 0$ with $a_k \neq 0$ for some index $k$ (possibly more than one). Then we can write

$$a_kv_k = -\sum_{i \neq k} a_iv_i$$

and if we divide by $a_k$, we get

$$v_k = \frac{1}{a_k}\left(-\sum_{i \neq k} a_iv_i\right) = -\sum_{i \neq k} \frac{a_i}{a_k} v_i = \sum_{i \neq k} \left(-\frac{a_i}{a_k}\right)v_i$$

and so we can express $v_k$ as a linear combination of the other vectors.

($\Leftarrow$) Assume that $\displaystyle v_k = \sum_{i \neq k} b_iv_i$ where $b_i \in \mathbb F$. Then $\displaystyle v_k - \sum_{i \neq k} b_iv_i = 0$, which is a linear combination equal to 0 with $a_k = 1 \neq 0$ and $a_i = -b_i$ for $i \neq k$. Thus the list is linearly dependent. $\blacksquare$
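
For instance, applying the lemma to the dependent list $((1, 1, 0), (1, 0, 1), (1, 2, -1))$ from earlier: solving the relation $2v_1 - v_2 - v_3 = 0$ for $v_3$ gives

$$(1, 2, -1) = 2(1, 1, 0) - (1, 0, 1)$$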

2.1.2 Theorem 2.6

Let $V$ be a finite-dimensional vector space spanned by $(v_1, \ldots, v_n)$, and let $m$ be the length of any linearly independent list of vectors in $V$. Then $m \leq n$.

Proof: The one given in class is almost inductive in nature but manages to avoid being an actual proof by induction. I don't really like it. To be continued.
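
Although the proof is deferred, the statement is easy to illustrate numerically (a sketch, assuming numpy): $\mathbb R^3$ is spanned by a list of length 3, so the theorem says that any linearly independent list in it has length at most 3; equivalently, any 4 vectors in $\mathbb R^3$ must be linearly dependent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Four vectors in R^3, which is spanned by a list of length n = 3.
vectors = rng.standard_normal((4, 3))

# Their rank is at most 3 < 4, so some nontrivial combination of them is zero:
# the list of 4 vectors is linearly dependent, consistent with m <= n.
print(np.linalg.matrix_rank(vectors))  # at most 3
```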