Monday, February 11, 2013 CC-BY-NC
Eigenvalues and eigenvectors

Maintainer: admin

Eigenvalues and eigenvectors, and the existence of eigenvalues for operators on complex vector spaces.

1 Eigenvectors

An eigenvector, corresponding to an eigenvalue $\lambda$ of a linear operator $T$, is a non-zero vector $u$ such that $Tu = \lambda u$.

1.1 Example

Let $T: \mathbb F^2 \to \mathbb F^2$, $(z, w) \mapsto (-w, z)$. For which $\lambda \in \mathbb F$ and which vectors $(z, w)$, do we have that $T(z, w) = \lambda (z, w)$? Well, $T(z, w) = (-w, z) = (\lambda z, \lambda w)$. So we must have that $-w = \lambda z$ and $z = \lambda w$. If we multiply both sides of the latter equation by $\lambda$, we get that $\lambda z = \lambda^2 w = -w$ and so $\lambda^2 = -1$.

Clearly this has no solutions if $\mathbb F = \mathbb R$, so $T$ has no eigenvalues over the reals. However, if we're working over the complex numbers, then $T$ has two eigenvalues, $\lambda_1 = i$ and $\lambda_2 = -i$, with associated eigenvectors $(w, -iw)$ and $(w, iw)$ respectively, for any non-zero $w \in \mathbb C$:

$$T(w, -iw) = (iw, w) = i(w, -iw) \quad T(w, iw) = (-iw, w) = -i(w, iw)$$

(Geometrically, this linear operator corresponds to a counterclockwise rotation by $90^\circ$. Obviously no such rotation can be achieved by multiplying a vector by a real scalar, so it makes sense that $T$ only has eigenvalues over the complex numbers.)
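We can check this numerically. Here's a minimal NumPy sketch of the same computation, writing $T$ as the matrix $\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$; since `np.linalg.eig` works over $\mathbb C$, it finds both complex eigenvalues:

```python
import numpy as np

# T(z, w) = (-w, z) as a matrix acting on column vectors.
T = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# np.linalg.eig works over C, so it finds the eigenvalues i and -i
# even though T has no real eigenvalues.
eigenvalues, eigenvectors = np.linalg.eig(T)
print(eigenvalues)  # [0.+1.j 0.-1.j]

# Verify T v = lambda v for each eigenpair (eigenvectors are columns).
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(T @ v, lam * v)
```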

1.2 Theorem 5.6

Let $T \in \mathcal L(V)$. Suppose $\lambda_1, \ldots, \lambda_m$ are distinct eigenvalues of $T$, and that $v_1,\ldots, v_m$ are the corresponding non-zero eigenvectors (one eigenvector for each eigenvalue). Then, $(v_1, \ldots, v_m)$ is linearly independent.

Proof: by contradiction. Assume that $(v_1, \ldots, v_m)$ is linearly dependent, and let $v_k$ be the first vector in the list that is in the span of the ones before it (that is, the first one that makes the list linearly dependent as we go through it one by one). Then $v_k \in \text{span}(v_1, \ldots, v_{k-1})$, so for some $a_1, \ldots, a_{k-1} \in \mathbb F$, we have that

$$v_k = a_1v_1 + \ldots + a_{k-1}v_{k-1} \tag{1}$$

Applying $T$ to both sides of (1), we get (since $Tv_i = \lambda_i v_i$):

$$Tv_k = \lambda_k v_k = a_1\lambda_1v_1 + \ldots + a_{k-1}\lambda_{k-1}v_{k-1} \tag{2}$$

Next, let's multiply (1) by $\lambda_k$, and subtract (2) from the result:

$$\lambda_k v_k - \lambda_k v_k = 0 = a_1(\lambda_k - \lambda_1)v_1 + \ldots + a_{k-1}(\lambda_k - \lambda_{k-1})v_{k-1} \tag{3}$$

But we know that $(v_1, \ldots, v_{k-1})$ is linearly independent (since $v_k$ is the first vector in our list to make it dependent). Thus all the coefficients in (3) must be zero. Since the $\lambda$'s are all distinct (by assumption), each factor $\lambda_k - \lambda_i$ is non-zero, so it must be that all the $a_i$'s are zero. But then (1) gives $v_k = 0$, which is a contradiction, since eigenvectors are non-zero by definition.

Hence, it must be that the list is linearly independent. $\blacksquare$
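To see the theorem in action numerically, here's a small NumPy sketch; the $3 \times 3$ matrix is an arbitrary example, built (by conjugating a diagonal matrix) to have the distinct eigenvalues $1$, $2$ and $3$:

```python
import numpy as np

# An arbitrary example: conjugate diag(1, 2, 3) by an invertible P to
# get an operator with the three distinct eigenvalues 1, 2, 3.
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
T = P @ np.diag([1.0, 2.0, 3.0]) @ np.linalg.inv(P)

eigenvalues, eigenvectors = np.linalg.eig(T)

# Theorem 5.6: eigenvectors for distinct eigenvalues are linearly
# independent, so the matrix whose columns are the eigenvectors has
# full rank.
assert np.linalg.matrix_rank(eigenvectors) == 3
```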

1.2.1 Corollary

Each operator on $V$ has at most $\dim V$ distinct eigenvalues. This follows from the fact that a list of eigenvectors corresponding to distinct eigenvalues is linearly independent; since there can be at most $\dim V$ linearly independent vectors in a vector space of dimension $\dim V$, this is an upper bound for the number of distinct eigenvalues.

1.3 Operations on operators

Let $T \in \mathcal L(V)$ and $m, n \in \mathbb N$. We define some standard arithmetic operations on linear operators (all of which result in linear operators), which will be useful for the next section; a short numerical sketch follows the list:

  • $\displaystyle T^m = \underbrace{T \cdot T \cdots T}_{m \text{ times}}$, with $T^0 = I$ (the identity operator)
  • $T^mT^n = T^{m+n}$
  • $(T^m)^n = T^{mn}$
  • If $T$ is invertible and $m > 0$, then $T^{-m} = (T^{-1})^m$
  • If $p \in \mathcal P(\mathbb F)$ is given by $p(z) = a_0 + \ldots + a_mz^m$, then $p(T) = a_0 I + a_1 T + a_2 T^2 + \ldots + a_m T^m$
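All of these identities carry over to matrices, so a minimal NumPy sketch can illustrate them; the $2 \times 2$ matrix and the polynomial $p(z) = 1 + 2z + z^2$ below are arbitrary examples:

```python
import numpy as np

# An arbitrary invertible operator on F^2.
T = np.array([[2.0, 1.0],
              [0.0, 3.0]])
I = np.eye(2)

# p(z) = 1 + 2z + z^2, so p(T) = I + 2T + T^2 per the definition above.
p_of_T = I + 2 * T + np.linalg.matrix_power(T, 2)
print(p_of_T)

# T^2 T^3 = T^5:
assert np.allclose(np.linalg.matrix_power(T, 2) @ np.linalg.matrix_power(T, 3),
                   np.linalg.matrix_power(T, 5))

# T^{-2} = (T^{-1})^2 (matrix_power handles negative exponents for
# invertible matrices):
T_inv = np.linalg.inv(T)
assert np.allclose(np.linalg.matrix_power(T, -2), T_inv @ T_inv)
```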

1.4 Theorem 5.10

Every operator on a finite-dimensional, non-zero, complex vector space has an eigenvalue.

Proof: Let $V$ be a complex vector space of dimension $n > 0$, and let $T \in \mathcal L(V)$. Choose a non-zero $v \in V$, and consider the list $(v, Tv, \ldots, T^n v)$. Since there are $n+1$ vectors in this list and $\dim V = n$, the list is linearly dependent. So there exist coefficients $a_0, \ldots, a_n \in \mathbb C$, not all zero, such that $a_0v + a_1Tv + \ldots + a_nT^nv = 0$. In other words, $p(T)v = 0$, where $p$ is the polynomial defined by $p(z) = a_0 + a_1z + \ldots + a_nz^n$.

Now, let $m$ be the largest index such that $a_m \neq 0$. Note that $m \neq 0$: if it were, we would have $a_0 v = 0$ even though neither $a_0$ nor $v$ is zero, which is impossible. So $0 < m \leq n$. From the fundamental theorem of algebra (which I don't actually recall learning in an algebra class), we can factor $p$ into linear factors, giving:

$$\begin{align}p(T)v & = a_0Iv + a_1Tv + \ldots + a_mT^m v \\ & = (a_0I + a_1T + \ldots + a_mT^m)v \tag{by distributivity} \\ & = c(T -\lambda_1 I) \cdots (T - \lambda_m I)v \tag{by FTA, where $c \in \mathbb C \setminus \{0\}$} \end{align}$$

Since this product of operators sends the non-zero vector $v$ to $0$, the product is not injective. But a composition of injective maps is injective, so at least one of the factors $T - \lambda_j I$ must fail to be injective. That is, there exists a non-zero $u \in V$ with $(T - \lambda_j I)u = 0$, i.e., $Tu = \lambda_j u$. Hence $\lambda_j$ is an eigenvalue of $T$. $\blacksquare$
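The construction in this proof can be carried out numerically as well. Below is a rough NumPy sketch, using a random matrix and vector as examples and an SVD to find the dependency coefficients $a_0, \ldots, a_n$:

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)
T = rng.standard_normal((n, n))  # an arbitrary operator (real entries, viewed over C)
v = rng.standard_normal(n)       # a (generically) non-zero vector

# Build the list (v, Tv, ..., T^n v): n+1 vectors in an n-dimensional
# space, hence linearly dependent.
powers = [v]
for _ in range(n):
    powers.append(T @ powers[-1])
A = np.column_stack(powers)      # n x (n+1) matrix

# A null-space vector of A gives coefficients a_0, ..., a_n with
# a_0 v + a_1 Tv + ... + a_n T^n v = 0, i.e. p(T) v = 0.
a = np.linalg.svd(A)[2][-1]
assert np.allclose(A @ a, 0)

# The roots of p(z) = a_0 + a_1 z + ... + a_n z^n are the lambda_j's in
# the factorization; at least one must make T - lambda I non-injective
# (i.e. singular), giving an eigenvalue of T.
roots = np.roots(a[::-1])        # np.roots wants the highest degree first
smallest_sv = min(np.linalg.svd(T - lam * np.eye(n), compute_uv=False)[-1]
                  for lam in roots)
assert smallest_sv < 1e-6        # some T - lambda_j I is singular
```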