**Maintainer:** admin

First class after the break. Continuing with chapter 6: orthogonal projections and minimisation problems, and an introduction to linear functionals and adjoints near the end. Midterms were handed back.

*1* Orthogonal complements

The **orthogonal complement** of a subspace $U \subseteq V$ is defined by

$$U^{\perp} = \{ v \in V : \langle u, v \rangle = 0 \, \forall u \in U \}$$

Thus it consists of all the vectors in $V$ that are orthogonal to *every* vector in $U$.
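For intuition, here is a small numerical sketch (our own, using NumPy; `orthogonal_complement` is a made-up helper, not a standard function) that computes $U^{\perp}$ for a subspace of $\mathbb R^3$ spanned by given vectors:

```python
import numpy as np

def orthogonal_complement(U_basis):
    """Columns of U_basis span U inside R^n; return an orthonormal basis
    (as columns) of U-perp, read off from the full SVD."""
    A = np.asarray(U_basis, dtype=float)
    u_full, s, _ = np.linalg.svd(A, full_matrices=True)
    rank = int(np.sum(s > 1e-10))
    # The left singular vectors beyond rank(A) span the complement.
    return u_full[:, rank:]

# U = span{(1,0,0), (0,1,0)} in R^3: the complement is the z-axis.
U = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
W = orthogonal_complement(U)   # one column, proportional to (0, 0, 1)
```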

*1.1* Direct sum theorem

$$V = U \oplus U^{\perp}$$

(Here $U$ must be finite-dimensional; there is no such restriction on $V$.)

Proof: First, we show that $V = U + U^{\perp}$. Let $(e_1, \ldots, e_n)$ be an orthonormal basis of $U$. Any $v \in V$ can be written as follows:

$$v = \underbrace{\langle v, e_1 \rangle e_1 + \ldots + \langle v, e_n\rangle e_n}_{\in U} + \underbrace{v - (\langle v, e_1 \rangle e_1 + \ldots + \langle v, e_n\rangle e_n)}_{\in U^{\perp}}$$

The first component is clearly in $U$, since it is just a linear combination of basis vectors of $U$. To show that the second component is in $U^{\perp}$, call it $w$. Then $\langle w, e_j \rangle = \langle v, e_j \rangle - \langle v, e_j \rangle = 0$ for each $j$ (all the other terms vanish because the basis vectors are pairwise orthonormal, so $\langle e_i, e_j \rangle = 0$ for $i \neq j$ and $\langle e_j, e_j \rangle = 1$). Thus $w$ is orthogonal to every vector in the span of the orthonormal basis, which is to say, every vector in $U$. Thus $w \in U^{\perp}$.

Next, we need to show that $U \cap U^{\perp} = \{0\}$. Let $v$ be in their intersection. Then, since $v \in U^{\perp}$, it must be perpendicular to every vector in $U$, including $v$ itself. Thus $\langle v, v\rangle = 0$. By positive definiteness, we know that $v = 0$. $\blacksquare$
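The decomposition in the proof can be checked numerically; a sketch (our own construction, assuming NumPy) that splits a vector of $\mathbb R^4$ into its $U$ and $U^{\perp}$ parts:

```python
import numpy as np

rng = np.random.default_rng(0)

# An orthonormal basis for a 2-dimensional subspace U of R^4,
# obtained by QR-factorising two random spanning vectors.
A = rng.standard_normal((4, 2))
E, _ = np.linalg.qr(A)          # columns e_1, e_2 are orthonormal

v = rng.standard_normal(4)
u = E @ (E.T @ v)               # sum of <v, e_j> e_j : the part in U
w = v - u                       # the part in U-perp

assert np.allclose(u + w, v)    # v = u + w
assert np.allclose(E.T @ w, 0)  # w is orthogonal to every e_j
```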

*1.1.1* Corollary

Any vector in $V$ can be written uniquely as the sum of some $u \in U$ and $w \in U^{\perp}$. Proof: this is exactly what the direct sum $V = U \oplus U^{\perp}$ means.

*1.2* Orthogonal projections

We define an orthogonal projection, $P_u$, as follows:

$$\begin{align} P_u: V & \to U \\ v & \mapsto u \end{align}$$

where $u$ is the component in $U$ (see previous section). Thus we send any vector to the part of its decomposition that is in $U$. If $(e_1, \ldots, e_n)$ is an orthonormal basis for $U$, then $P_uv = \langle v, e_1\rangle e_1 + \ldots + \langle v, e_n \rangle e_n$.
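In coordinates, if the columns of a matrix $E$ form an orthonormal basis of $U$, the formula above becomes $P_uv = E E^{\top} v$. A quick NumPy sketch (our own, not from the lecture) checking the characteristic properties:

```python
import numpy as np

rng = np.random.default_rng(1)
E, _ = np.linalg.qr(rng.standard_normal((5, 3)))  # orthonormal basis of U in R^5
P = E @ E.T                                       # matrix of P_U

v = rng.standard_normal(5)
Pv = P @ v

assert np.allclose(P @ Pv, Pv)         # projecting twice changes nothing
assert np.allclose(E.T @ (v - Pv), 0)  # v - P v is orthogonal to U
```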

*1.2.1* Proposition 6.36

Let $U$ be a subspace of $V$, and let $v \in V$ be fixed. Then the distance between $v$ and $P_uv$ satisfies the inequality:

$$\lVert v - P_uv \rVert \leq \lVert v- u \rVert$$

for any $u \in U$. So $P_uv$ is closest to $v$ among all vectors $u \in U$. Equality holds only when $u = P_uv$.

Proof: Let $u \in U$. Then:

$$\begin{align} \lVert v- u \rVert^2 & = \lVert v- P_uv + P_uv - u \rVert^2 \\ & = \lVert v- P_uv \rVert^2 + \underbrace{\lVert P_uv -u\rVert^2}_{\geq 0} \tag{Pythagorean theorem: $v - P_uv \in U^{\perp}$ and $P_uv - u \in U$ are orthogonal} \\ & \geq \lVert v - P_uv \rVert^2 \tag{since the other component $\geq 0$} \end{align}$$

Taking the square root of both sides gives $\lVert v - u\rVert \geq \lVert v - P_uv\rVert$, as desired. Since the first step is an equality, equality overall holds exactly when $\lVert P_uv - u \rVert^2 = 0$, i.e. when $u = P_uv$. $\blacksquare$
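The proposition is easy to sanity-check numerically; in the sketch below (our own, assuming NumPy), no randomly chosen $u \in U$ beats $P_uv$:

```python
import numpy as np

rng = np.random.default_rng(2)
E, _ = np.linalg.qr(rng.standard_normal((6, 2)))  # orthonormal basis of U in R^6
v = rng.standard_normal(6)
Pv = E @ (E.T @ v)
best = np.linalg.norm(v - Pv)

# No other element of U gets closer to v than P_U v does.
for _ in range(1000):
    u = E @ rng.standard_normal(2)   # a random element of U
    assert np.linalg.norm(v - u) >= best - 1e-12
```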

*1.2.2* Exercise

Find the polynomial $u \in P_2(\mathbb R)$ that best approximates $\sin(x)$ on the interval $[-\pi, \pi]$. That is, we want to minimise

$$\int_{-\pi}^{\pi} |\sin(x) - u(x)|^2\,dx$$

Formulated as a minimisation problem: the vector space $V$ is the set of continuous real-valued functions on $[-\pi, \pi]$, with inner product $\langle f, g \rangle = \int_{-\pi}^{\pi} f(x)g(x)\,dx$, and $U = P_2(\mathbb R)$. We know that $P_uv$ is the minimiser (where $v(x) = \sin(x)$), so we just need to compute it. We can do this by finding an orthonormal basis of $P_2(\mathbb R)$ and then applying the projection.

To find an orthonormal basis, we use Gram-Schmidt. Recall that the standard basis is $\{1, x, x^2\}$; write $v_1 = 1$, $v_2 = x$, $v_3 = x^2$. (The computation below illustrates the procedure with the inner product $\langle f, g \rangle = \int_0^1 f(x)g(x)\,dx$; for the actual problem, the same steps apply with $\int_{-\pi}^{\pi}$ in place of $\int_0^1$.) Note that $\lVert v_1 \rVert = 1$ since it's just the integral of 1 from 0 to 1. Then:

$$\begin{align} e_1 & = \frac{v_1}{\lVert v_1 \rVert} = v_1 = 1 \\ e_2 & = \frac{v_2 - \langle v_2, e_1 \rangle e_1}{\lVert v_2 - \langle v_2, e_1 \rangle e_1 \rVert} = \frac{x - \int_0^1 x \,dx}{\lVert x - \int_0^1 x \,dx \rVert} = \frac{x - \frac{1}{2}}{\sqrt{\int_0^1 (x - \frac{1}{2})^2\,dx} } = \frac{x-\frac{1}{2}}{\sqrt{\frac{1}{12}}} = \sqrt{12}x - \sqrt{3} \\ e_3 & = \sqrt{180}\left(x^2 - x + \frac{1}{6}\right) \tag{skipping the derivation because it's tedious to type: orthogonalising $x^2$ gives $x^2 - x + \frac{1}{6}$, whose norm is $\frac{1}{\sqrt{180}}$} \end{align}$$
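As a sanity check (ours, not from the lecture), the three polynomials $1$, $\sqrt{12}\,x - \sqrt{3}$, and $\sqrt{180}\,(x^2 - x + \frac{1}{6})$ are indeed orthonormal under $\langle f, g\rangle = \int_0^1 f(x)g(x)\,dx$, which we can verify with a midpoint rule:

```python
import numpy as np

# Midpoint-rule approximation of <f, g> = integral of f*g over [0, 1].
n = 100_000
x = (np.arange(n) + 0.5) / n
dx = 1.0 / n

def ip(f, g):
    return np.sum(f(x) * g(x)) * dx

e1 = lambda t: np.ones_like(t)
e2 = lambda t: np.sqrt(12) * t - np.sqrt(3)
e3 = lambda t: np.sqrt(180) * (t**2 - t + 1/6)

# Gram matrix of the three candidates; should be the identity.
G = np.array([[ip(a, b) for b in (e1, e2, e3)] for a in (e1, e2, e3)])
assert np.allclose(G, np.eye(3), atol=1e-6)
```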

Then $P_uv = \langle v, e_1 \rangle e_1 + \langle v, e_2 \rangle e_2 + \langle v, e_3 \rangle e_3$. The conclusion of this is left as an exercise for the reader because I don't want to have to compute all those integrals.
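Those integrals can be done numerically instead. On $[-\pi, \pi]$ the interval is symmetric, so $1$, $x$, and $x^2 - \frac{\pi^2}{3}$ are already pairwise orthogonal and only need normalising; and since $\sin$ is odd, only the $x$ term survives, giving $P_uv = \frac{3}{\pi^2}x$. A NumPy sketch of this claim (our own computation, not checked in lecture):

```python
import numpy as np

# Midpoint-rule inner product <f, g> = integral of f*g over [-pi, pi].
n = 200_000
dx = 2 * np.pi / n
x = -np.pi + (np.arange(n) + 0.5) * dx
ip = lambda f, g: np.sum(f * g) * dx

# Orthonormal basis of P_2 on [-pi, pi]: on a symmetric interval,
# 1, x, and x^2 - pi^2/3 are already pairwise orthogonal.
e1 = np.ones_like(x) / np.sqrt(2 * np.pi)
e2 = x * np.sqrt(3 / (2 * np.pi**3))
e3 = (x**2 - np.pi**2 / 3) / np.sqrt(8 * np.pi**5 / 45)

v = np.sin(x)
proj = ip(v, e1) * e1 + ip(v, e2) * e2 + ip(v, e3) * e3

# Only the odd basis vector contributes, so P_u v = (3 / pi^2) x.
assert np.allclose(proj, (3 / np.pi**2) * x, atol=1e-3)
```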

*2* Linear functionals and adjoints

A **linear functional** is a linear map from $V$ to $\mathbb F$. An example: $\varphi: \mathbb F^3 \to \mathbb F$ which sends $(z_1, z_2, z_3)$ to $z_1-z_2 + 7z_3$. Or, more generally, $\varphi : V \to \mathbb F$ which sends $u$ to $\langle u, v \rangle$ where $v$ is some fixed vector in $V$. (The first example is just a special case of the last one, where $v = (1, -1, 7)$.)
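In the real case the inner product on $\mathbb F^3 = \mathbb R^3$ is just the dot product, so the first example is literally $\varphi(z) = \langle z, v \rangle$ with $v = (1, -1, 7)$; a one-line NumPy check (ours):

```python
import numpy as np

v = np.array([1.0, -1.0, 7.0])

def phi(z):
    """The functional (z1, z2, z3) -> z1 - z2 + 7*z3, written as <z, v>."""
    return float(np.dot(z, v))

z = np.array([2.0, 5.0, 1.0])
result = phi(z)   # 2 - 5 + 7 = 4.0
```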

We'll continue this next class.