# Thursday, March 21, 2013

## 1 Inner product spaces, self-adjoint operators, eigenvalues and eigenvectors

$V$ inner product space over $\mathbb{C}$

$<\mathbf{v},\mathbf{w}>$ Hermitian inner product ($<\mathbf{v},\mathbf{w}> = \overline{<\mathbf{w},\mathbf{v}>}$ etc.)

$T: V \rightarrow V$ is a self-adjoint linear transformation (operator) if and only if $<\mathbf{v},T\mathbf{w}> = <T\mathbf{v},\mathbf{w}>$ for all $\mathbf{v},\mathbf{w} \in V$ (corresponds to Hermitian matrices)

In a finite-dimensional space the matrix can be diagonalized with an orthonormal basis of eigenvectors

Hypothesis: the standard basis is orthonormal with respect to the complex inner product

Consider $<\mathbf{v},T_{A}\mathbf{w}> = \mathbf{v}^{t}\overline{(A\mathbf{w})} = \mathbf{v}^{t}\bar{A}\bar{\mathbf{w}} = (\bar{A}^{t}\mathbf{v})^{t}\bar{\mathbf{w}} = <\bar{A}^{t}\mathbf{v},\mathbf{w}>$

$A$ Hermitian means $\bar{A}^{t} = A$, so $T_A$ is self-adjoint when $A$ is Hermitian.
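This correspondence is easy to check numerically. A minimal sketch in NumPy (the matrix and vectors below are arbitrary choices, not from the notes), using the convention above, $<\mathbf{v},\mathbf{w}> = \mathbf{v}^{t}\bar{\mathbf{w}}$:

```python
import numpy as np

# Inner product matching the notes' convention: <v, w> = v^t * conj(w)
def inner(v, w):
    return np.dot(v, np.conj(w))

# An arbitrary Hermitian matrix: conj(A)^t = A
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
assert np.allclose(A, A.conj().T)

rng = np.random.default_rng(0)
v = rng.standard_normal(2) + 1j * rng.standard_normal(2)
w = rng.standard_normal(2) + 1j * rng.standard_normal(2)

# Self-adjointness: <v, A w> = <A v, w>
print(np.isclose(inner(v, A @ w), inner(A @ v, w)))  # True
```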

Theorem $V$ inner product space over $\mathbb{C}$

$T: \ V \rightarrow V$ self adjoint linear transformation (operator)

$\lambda$ eigenvalue of $T$ and $\mathbf{v} \neq \mathbf{0}$ an eigenvector for $\lambda$

Conclusion: $\lambda$ is a real number, i.e. $\lambda = \bar{\lambda}$

Proof: Let $\mathbf{v}$ be this eigenvector; then $T\mathbf{v} = \lambda\mathbf{v}$, $\mathbf{v} \neq \mathbf{0}$

Compute $<T\mathbf{v},\mathbf{v}>$ two ways:

• $T$ self-adjoint: $<T\mathbf{v},\mathbf{v}> = <\mathbf{v},T\mathbf{v}>$

• $\mathbf{v}$ eigenvector for $\lambda$: $T\mathbf{v} = \lambda\mathbf{v}$

$\to$ inner product properties

\begin{aligned} \lambda <\mathbf{v},\mathbf{v}> = <T\mathbf{v},\mathbf{v}> &= <\mathbf{v},T\mathbf{v}> = \bar{\lambda}<\mathbf{v},\mathbf{v}> \\ \text{hence} \quad (\lambda - \bar{\lambda})<\mathbf{v},\mathbf{v}> &= 0 \quad \text{but } <\mathbf{v},\mathbf{v}> \neq 0 \\ \lambda - \bar{\lambda} &= 0 \\ \lambda &= \bar{\lambda} \quad \text{ i.e. } \lambda \text{ real} \end{aligned}
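A numerical illustration of this theorem (a sketch; the Hermitian matrix below is an arbitrary example, not from the notes): even the general-purpose eigenvalue routine returns values whose imaginary parts vanish.

```python
import numpy as np

# An arbitrary Hermitian matrix
A = np.array([[1.0, 2 + 1j, 0],
              [2 - 1j, 0.0, 1j],
              [0, -1j, 4.0]])
assert np.allclose(A, A.conj().T)

eigvals = np.linalg.eigvals(A)        # general (non-Hermitian-aware) routine
print(np.allclose(eigvals.imag, 0))   # True: eigenvalues are real up to rounding
```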

Theorem $V$ inner product space over $\mathbb{C}$

$T: V \rightarrow V$ self-adjoint

$\mathbf{0} \neq \mathbf{v_1}$ eigenvector for $T$ and $\lambda_1$

$\mathbf{0} \neq \mathbf{v_2}$ eigenvector for $T$ and $\lambda_2$

Conclusion $\lambda_1 \neq \lambda_2$ implies $<\mathbf{v_1},\mathbf{v_2}> = 0$

Proof: self-adjointness gives $<T\mathbf{v_1},\mathbf{v_2}> = <\mathbf{v_1},T\mathbf{v_2}>$

Hence $<\lambda_1\mathbf{v_1},\mathbf{v_2}> = <\mathbf{v_1},\lambda_{2}\mathbf{v_2}>$

Recall $\lambda_1,\lambda_2$ are real.

Aside:

\begin{aligned} \lambda_1 < \mathbf{v_1}, \mathbf{v_2}> &= < \mathbf{v_1}, \mathbf{v_2}> \bar{\lambda_2} \\ (\lambda_1 - \bar{\lambda_2}) < \mathbf{v_1}, \mathbf{v_2}> &= 0 \qquad \text{ but } \bar{\lambda_2} = \lambda_2 \text{ and } \lambda_1 - \lambda_2 \neq 0 \\ \text{ forcing } < \mathbf{v_1}, \mathbf{v_2}> &= 0\end{aligned}
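The orthogonality conclusion can also be checked numerically (a sketch; the $2\times 2$ Hermitian matrix below is an arbitrary example with distinct eigenvalues $1$ and $3$):

```python
import numpy as np

A = np.array([[2.0, 1j],
              [-1j, 2.0]])          # Hermitian; char. poly (2-x)^2 - 1, roots 1 and 3
vals, vecs = np.linalg.eigh(A)      # eigh: eigensolver for Hermitian matrices
v1, v2 = vecs[:, 0], vecs[:, 1]

# Distinct eigenvalues -> orthogonal eigenvectors
print(np.isclose(np.vdot(v1, v2), 0))  # True
```

(`np.vdot` conjugates its first argument, so it matches a Hermitian inner product up to which slot carries the conjugate; for testing orthogonality the choice of slot does not matter.)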

Now assume dim $V = n < \infty$

$\lambda$ is an eigenvalue of $A$ if and only if $(A-\lambda I)$ is not invertible, i.e. $\mathrm{det}(A-\lambda I) = 0$

$\mathrm{det}(A-\lambda I) = \sum_{\sigma} (\mathrm{sign}\ \sigma)(\text{product of entries on a generalized diagonal})$

$\mathrm{det}\begin{bmatrix} a_{11} - \lambda & a_{12} & \ldots & a_{1n} \\ a_{21} & a_{22} - \lambda & \ldots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \ldots & a_{nn} -\lambda \end{bmatrix} = (a_{11} - \lambda)(a_{22} - \lambda) \cdots (a_{nn} - \lambda) + \text{ lower-degree terms}$

Hence $\mathrm{det}(A-\lambda I)$ is a polynomial of degree exactly $n = \mathrm{dim}\, V$ in $\lambda$

Definition $\mathrm{Trace} A = a_{11} + a_{22} + \ldots + a_{nn}$

Note the constant term in $\mathrm{det}(A-\lambda I)$ is $\mathrm{det}\, A$ (set $\lambda = 0$)

## 2 Gauss's fundamental theorem of algebra

If $f(\lambda)$ is a polynomial of degree $n > 0$ in $\lambda$, then there is $r \in \mathbb{C}$ such that $f(r)=0$.

Factor Theorem then implies $f(\lambda) = (\lambda-r_1) \ldots (\lambda - r_n)$

$f(r) = 0 \rightarrow f(\lambda) = (\lambda - r)q(\lambda)$

Example: $f(x) = 2x^{2} + 3x + 4$. In general, dividing by $(\lambda - r)$ gives $f(\lambda) = (\lambda - r)q(\lambda) + \rho$ with constant remainder $\rho = f(r)$; when $f(r) = 0$ the remainder vanishes and $f(\lambda) = (\lambda - r)q(\lambda)$.
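A numerical illustration (not part of the notes): the quadratic $2x^{2} + 3x + 4$ has no real roots (discriminant $9 - 32 = -23 < 0$), but Gauss's theorem guarantees complex ones, and each root divides $f$ with zero remainder.

```python
import numpy as np

coeffs = [2, 3, 4]          # 2x^2 + 3x + 4
roots = np.roots(coeffs)    # two complex-conjugate roots

# Each root r satisfies f(r) = 0, so (lambda - r) divides f exactly
for r in roots:
    assert np.isclose(np.polyval(coeffs, r), 0)
print(roots)
```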

Definition: $A$ is $n\times n, \ n < \infty$

$X_A(\lambda) = \mathrm{det}(\lambda I - A) = (-1)^{n}\mathrm{det}(A-\lambda I)$

$X_A(\lambda) = \lambda^{n} - (\mathrm{Trace}\, A)\lambda^{n-1} + (\text{terms of degree between } 1 \text{ and } n-2) + (-1)^{n}\mathrm{det}\, A$
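This coefficient pattern can be verified numerically (a sketch with an arbitrarily chosen $3\times 3$ matrix; `np.poly` returns the coefficients of the monic characteristic polynomial $\mathrm{det}(\lambda I - A)$, highest degree first):

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [3.0, 4.0, 1.0],
              [0.0, 1.0, 5.0]])     # arbitrary example
coeffs = np.poly(A)                 # [1, c_{n-1}, ..., c_0]
n = A.shape[0]

print(np.isclose(coeffs[1], -np.trace(A)))                 # lambda^{n-1} coeff = -Trace(A)
print(np.isclose(coeffs[-1], (-1)**n * np.linalg.det(A)))  # constant term = (-1)^n det(A)
```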

Theorem $V$ a finite-dimensional ($\mathrm{dim}\, V = n < \infty$) inner product space over $\mathbb{C}$, $A$ Hermitian ($T_A$ self-adjoint)

Conclusion (a) there is an orthonormal basis of $V$ consisting entirely of eigenvectors of $A$

(b) There is a unitary matrix $P$ such that $P^{-1}AP = \begin{pmatrix} \lambda_1 & & 0 \\ & \ddots & \\ 0 & & \lambda_n \end{pmatrix}$
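In practice this is exactly what `np.linalg.eigh` computes: a sketch (the Hermitian matrix below is an arbitrary example) showing that its eigenvector matrix $P$ is unitary and that $P^{-1}AP = \bar{P}^{t}AP$ comes out diagonal.

```python
import numpy as np

A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])       # arbitrary Hermitian matrix
vals, P = np.linalg.eigh(A)         # columns of P = orthonormal eigenvectors

assert np.allclose(P.conj().T @ P, np.eye(2))  # P unitary: conj(P)^t = P^{-1}
D = P.conj().T @ A @ P                          # = P^{-1} A P
print(np.allclose(D, np.diag(vals)))            # True: diagonal of eigenvalues
```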

Unitary: $\bar{P}^{t} = P^{-1}$; equivalently, the columns of $P$ form an orthonormal basis

Proof: Let $E_{\lambda_j}$ be the eigenspace for the eigenvalue $\lambda_j$ of $A$ (then $\mathrm{dim}\, E_{\lambda_j} \geq 1$)

Claim (proof later) $\mathbf{v}$ an eigenvector for $\lambda$

Then $\mathbf{v}^{\perp}$ is an invariant subspace under the linear transformation induced on $V$ by $A$.

Extend $\{\mathbf{v} = \mathbf{v_1}\}$ to an orthonormal basis of $V$, say $\{\mathbf{v_1},\mathbf{v_2}, \ldots, \mathbf{v_n}\}$

Then $\{\mathbf{v_2}, \ldots, \mathbf{v_n}\}$ is an orthonormal basis of $\mathbf{v_1}^{\perp}$

$A$ then induces a self-adjoint linear transformation on $\mathbf{v_{1}^{\perp}}$

By induction there is an orthonormal basis of eigenvectors for $\mathbf{v_{1}^{\perp}}$, say $\{\hat{\mathbf{v_2}}, \ldots, \hat{\mathbf{v_n}} \}$

Our required orthonormal basis of eigenvectors is then $\{ \frac{\mathbf{v_1}}{\|v_1\|}, \hat{\mathbf{v_2}}, \ldots ,\hat{\mathbf{v_n}} \}$

This proves (a), and (b) follows directly.

Proof of claim (regarding $\mathbf{v}^{\perp}$): $\mathbf{v} \neq \mathbf{0}$ an eigenvector for $\lambda$,

$T$ self-adjoint (corresponding to Hermitian $A$)

$\mathbf{w} \in \mathbf{v}^{\perp} \to <\mathbf{v},\mathbf{w}> = 0$, and $<\mathbf{v},T\mathbf{w}> = <T\mathbf{v},\mathbf{w}> = <\lambda\mathbf{v},\mathbf{w}> = \lambda<\mathbf{v},\mathbf{w}> = 0$ ($T$ self-adjoint)

Hence we have shown: for $\mathbf{v}$ an eigenvector of self-adjoint $T$,

$\mathbf{w} \in \mathbf{v}^{\perp} \to T\mathbf{w} \in \mathbf{v}^{\perp}$

For a Hermitian matrix (self-adjoint $T$), the restriction of $T$ to $\mathbf{v}^{\perp}$ is therefore again self-adjoint: $$\mathbf{w_1},\mathbf{w_2} \in \mathbf{v}^{\perp} \qquad <\mathbf{w_1},T\mathbf{w_2}> = <T\mathbf{w_1},\mathbf{w_2}> \qquad T\mathbf{w_1} \in \mathbf{v}^{\perp} \qquad T\mathbf{w_2} \in \mathbf{v}^{\perp}$$
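The claim can be checked numerically (a sketch; the Hermitian matrix and random vector below are arbitrary): take an eigenvector $\mathbf{v}$, project a random $\mathbf{w}$ into $\mathbf{v}^{\perp}$, and confirm $A\mathbf{w}$ stays in $\mathbf{v}^{\perp}$.

```python
import numpy as np

# The notes' inner product: <v, w> = v^t * conj(w)
def inner(v, w):
    return np.dot(v, np.conj(w))

A = np.array([[2.0, 1j],
              [-1j, 2.0]])          # arbitrary Hermitian matrix
vals, vecs = np.linalg.eigh(A)
v = vecs[:, 0]                      # an eigenvector of A

rng = np.random.default_rng(1)
w = rng.standard_normal(2) + 1j * rng.standard_normal(2)
w = w - (inner(w, v) / inner(v, v)) * v   # project w into v-perp
assert np.isclose(inner(w, v), 0)

print(np.isclose(inner(v, A @ w), 0))     # True: A w remains in v-perp
```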

Recall the eigenspace of $\lambda$: $E_{\lambda} = \{ \mathbf{v} \in V \mid T\mathbf{v} = \lambda\mathbf{v} \}$

Note In our case $\mathrm{dim} E_\lambda = \text{ multiplicity of } \lambda$ and $V = E_{\lambda_1}\bigoplus \ldots \bigoplus E_{\lambda_k}$