# Winter 2012 Final

Winter 2012's disastrously hard closed-book MATH 223 final, by Wilbur Jonsson. No solutions yet; feel free to fill them in.

The original PDF is available through Docuum.

## Question 1

### Question

For each of the following subsets, decide whether it is or is not a subspace of the given vector space (justify your answers using the three part subspace criterion):

• (a) The subset $\left \{z \in \mathit{C} \mid \left \vert z \right \vert \leq 1 \right \}$ of the complex vector space $\mathit{C}$.
• (b) The subset of the real vector space of real valued functions of one variable consisting of differentiable functions.
• (c) The subset of the complex vector space of polynomials with complex coefficients consisting of those polynomials all of whose roots in $\mathit{C}$ are distinct.
• (d) The subset of the real vector space of polynomials with complex coefficients consisting of those polynomials of degree at most 10.
• (e) The intersections of any collection of subspaces of a vector space V.

### Solution

Recall that the three part subspace criterion for $W \leq V$ with $V, W$ over $\mathbb{F}$ is the following:

1. $0 \in W$
2. $u,v \in W \to u+v \in W$
3. $u \in W, c \in \mathbb{F} \to cu \in W$

(a)

Let $z_1, z_2 \in \mathbb{C}$ such that $z_1, z_2 \in W$. Then,

$$\begin{cases} |z_1| \leq 1\\ |z_2| \leq 1 \end{cases} \qquad \to \qquad |z_1 + z_2| \leq |z_1|+|z_2| \leq 2 \quad \text{ but }\quad |z_1 + z_2| \leq 1 \quad \text{not guaranteed}$$

For example, take $z_1 = z_2 = 0.7$. Then $z_1, z_2 \in W$ since $|z_1| \leq 1$ and $|z_2| \leq 1$, but $|z_1 + z_2| = 1.4 > 1$, meaning $z_1 + z_2 \not \in W$. This violates criterion 2.

$W$ is not a subspace
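The counterexample can be checked numerically; a quick sketch:

```python
# z1 and z2 each lie in the closed unit disk, but their sum does not,
# so the set is not closed under addition.
z1 = z2 = 0.7 + 0j

assert abs(z1) <= 1 and abs(z2) <= 1   # both in W
assert abs(z1 + z2) > 1                # z1 + z2 escapes W
print(abs(z1 + z2))  # 1.4
```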

(b)

$W = \{f : \mathbb{R} \to \mathbb{R} \ | \ f \text{ is differentiable}\}$, a subset of $V$, the real vector space of real valued functions of one variable

1. $f(x) = 0 \in W$
2. Let $f(x), g(x) \in W$, then $(f(x) + g(x))' = f'(x) + g'(x)$. Both $f'(x)$ and $g'(x)$ are defined given $f, g$ are differentiable, so their sum is as well. Therefore, $f(x) + g(x) \in W$.
3. Let $f(x) \in W, c \in \mathbb{R}$, then $(cf(x))' = cf'(x)$. $f'(x)$ is defined and so is any of its multiples. Therefore, $cf(x) \in W$.

$W$ is a subspace

(c)

$W = \{f(x) = c(x-z_1)(x-z_2)\ldots(x - z_k) \ | \ c \in \mathbb{C}, \quad z_1, z_2, \ldots, z_k \in \mathbb{C} \text{ pairwise distinct}\}$, a subset of $\mathbb{F} =$ the complex vector space of polynomials with complex coefficients

Let $z_1, z_2, z_3$ be pairwise distinct complex numbers and $f(x) = (x-z_1)(x-z_2), g(x) = (x-z_1)(x-z_3)$ so that $f(x), g(x) \in W$ since each has distinct roots.

Then,
$$f(x) + g(x) = (x-z_1)(x-z_2)+(x-z_1)(x-z_3) = (x-z_1)(x-z_2 +x -z_3) = 2(x-z_1)(x - \frac{z_2 + z_3}{2})$$

So it suffices to choose $z_2 \neq z_3$ with $z_1 = \frac{z_2 + z_3}{2}$: then $f(x) + g(x) = 2(x-z_1)^2$ has a repeated root, so $f(x) + g(x) \not \in W$, violating criterion 2.

$W$ is not a subspace
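Concretely, taking $z_1 = 0$, $z_2 = 1$, $z_3 = -1$ gives $f(x) = x(x-1)$ and $g(x) = x(x+1)$, whose sum $2x^2$ has a repeated root; a small sketch verifying the coefficient arithmetic:

```python
def poly_mul(p, q):
    """Multiply polynomials given as coefficient lists, lowest degree first."""
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

# f(x) = x(x - 1) and g(x) = x(x + 1): each has distinct roots.
f = poly_mul([0, 1], [-1, 1])
g = poly_mul([0, 1], [1, 1])

s = [a + b for a, b in zip(f, g)]
print(s)  # [0, 0, 2], i.e. f + g = 2x^2, with 0 as a double root
```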

(d)

$W = \{p(x) = a_0 + a_1 x + \ldots + a_{10}x^{10} \ | \ a_i \in \mathbb{C}\}$, regarded as a subset of the real vector space of polynomials with complex coefficients (so the scalars are real)

1. $p(x) = 0 \in W$ (zero polynomial)
2. (I'll combine it with criterion 3)

Let $f(x), g(x) \in W$ and $c \in \mathbb{R}$; then $cf(x) + g(x) = c(a_0 + a_1 x + \ldots + a_{10}x^{10}) + (b_0 + b_1 x + \ldots + b_{10}x^{10}) = (ca_0 + b_0) + (ca_1 + b_1)x + \ldots + (ca_{10} + b_{10})x^{10} \in W$ since each coefficient $ca_i + b_i$ is complex and the degree is still at most 10.

$W$ is a subspace
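A quick numeric sketch of the combined closure check, using hypothetical sample coefficients:

```python
import random

random.seed(1)
# Polynomials of degree at most 10 as coefficient lists of length 11,
# with complex entries; the scalar c is real.
f = [complex(random.random(), random.random()) for _ in range(11)]
g = [complex(random.random(), random.random()) for _ in range(11)]
c = 2.5

h = [c * a + b for a, b in zip(f, g)]          # cf + g, coefficientwise
assert len(h) == 11                            # degree is still at most 10
assert all(isinstance(x, complex) for x in h)  # coefficients are still complex
```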

(e)

$W = \{\cap_{i \in S} U_i \ | \ U_i \leq V \}$ with $S = \{1,2, \ldots \}$

Note: $\leq$ is Prof. Jonsson's notation for subspace.

1. $0 \in U_i \quad \forall i \in S$, since this is part of the definition of a subspace; therefore $0$ is also in their intersection, so $0 \in W$.
2. Let $u, v \in W$; then $u$ and $v$ are in every subspace $U_i$ of $V$. Since subspaces are closed under addition (precisely criterion 2), $u+v$ is also in every subspace $U_i$ of $V$; hence $u+v \in W$.
3. Similarly, let $u \in W$ and $c \in \mathbb{F}$; since each $U_i$ is closed under scalar multiplication (criterion 3), $cu \in U_i$ for every $i \in S$, hence $cu \in W$.

$W$ is a subspace

### Accuracy and discussion

Solution written by @eleyine. I still need someone to verify it.

Looks right - @dellsystem

## Question 2

### Question

Consider the polynomial $f(x)=x^3-x^2-5x-3=(x-3)(x+1)^2$ and define $U_f = \left \{ f(x)g(x) \mid g(x) \in \mathit{F} \left [ x \right ] \right \}$ whereby $\mathit{F} \left [ x \right ]$ consists of all polynomials in the indeterminate x with coefficients from the field $\mathit{F}$.

• (a) Find polynomials $a(x)$, $b(x)$ such that $a(x)(x-3) + b(x)(x+1)^2 = 1$
• (b) Show that the quotient space $V/U_f$ is the internal direct sum of $\text{Image} \hspace{2mm} T_{x-3}$ and $\text{Image} \hspace{2mm} T_{(x+1)^2}$ with the notation used in class. Why is $a(T)$ invertible on $\text{Image} \hspace{2mm} T_{(x-3)}$ ?
• (c) Find a basis for the factor space such that the matrix of the induced linear transformation has the form

$$\begin{pmatrix} 3 & 0 & 0 \\ 0 & -1 & 1 \\ 0 & 0 & -1 \end{pmatrix}$$

### Solution

None
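No full solution has been filled in yet. For part (a), one step of polynomial division gives $(x+1)^2 = (x-3)(x+5) + 16$, which suggests the candidate pair $a(x) = -\frac{x+5}{16}$, $b(x) = \frac{1}{16}$ (a sketch, not a posted solution); checking the identity with exact rational coefficients:

```python
from fractions import Fraction

def poly_mul(p, q):
    """Multiply polynomials given as coefficient lists, lowest degree first."""
    out = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def poly_add(p, q):
    n = max(len(p), len(q))
    p = p + [Fraction(0)] * (n - len(p))
    q = q + [Fraction(0)] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

a = [Fraction(-5, 16), Fraction(-1, 16)]   # a(x) = -(x + 5)/16
b = [Fraction(1, 16)]                      # b(x) = 1/16

lhs = poly_add(poly_mul(a, [-3, 1]),       # a(x)(x - 3)
               poly_mul(b, [1, 2, 1]))     # b(x)(x + 1)^2
assert lhs == [1, 0, 0]                    # a(x)(x-3) + b(x)(x+1)^2 = 1
```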

### Accuracy and discussion

N/A

## Question 3

### Question

• (a) Consider the set $\mathit{V}$ of all sequences of elements of the field $\mathit{F}$ as vectors with countably many components, e.g. $x= \left ( x_1, x_2, ..., x_k, ... \right )$ with all $x_j \in \mathit{F}$ is a typical such vector.
Show that the sequences satisfying the recurrence relation $x_{n+2} = ax_{n+1} + bx_n$ form a two dimensional subspace of $\mathit{V}$.
• (b) With the two dimensional subspace of the previous part of this question in mind, solve the following difference equation explicitly by reducing it to a problem about the eigenvalues of a two by two matrix. (That is, find an explicit, nonrecursive formula for $x_n$.)

$$x_1 = 5, \hspace{5mm} x_2 = 3, \hspace{5mm} x_{n+2} = 3x_{n+1} + 4x_n \hspace{2mm} \text{for} \hspace{2mm} n\geq1$$

### Solution

(a)

We first note that the sequences $\mathbf{Z}$ and $\mathbf{Y}$ satisfying the recurrence with initial terms $(1, 0)$ and $(0, 1)$ respectively form a basis for the sequences satisfying $x_{n+2} = \alpha x_{n+1}+ \beta x_n$: such a sequence is determined entirely by its first two terms, so $x_1 \mathbf{Z}+x_2\mathbf{Y} = (x_1, x_2, x_3, \ldots) = \mathbf{x}$ for any solution $\mathbf{x}$, and $\mathbf{Z}, \mathbf{Y}$ are clearly linearly independent; hence the dimension is two. Now, to prove such sequences (that we call $W$) form a subspace of $V$, we use the three part criterion.

1. $\mathbf{0} = (0,0,\ldots) \in W$ obviously because it satisfies the recurrence relation $x_{n+2} = \alpha x_{n+1}+ \beta x_n \leftrightarrow 0 = \alpha \cdot 0+ \beta \cdot 0$.
2. Let $\mathbf{l} = (l_1, l_2, \ldots , l_n, \ldots) \in W$ and $\mathbf{m} = (m_1, m_2, \ldots , m_n, \ldots) \in W$. Then, $l_{n+2} = \alpha l_{n+1} + \beta l_{n}, \quad m_{n+2} = \alpha m_{n+1} + \beta m_{n}$. What of $(l_{n+2} + m_{n+2})?$
$$l_{n+2} + m_{n+2} = (\alpha l_{n+1} + \beta l_{n}) + (\alpha m_{n+1} + \beta m_{n}) = \alpha(l_{n+1}+ m_{n+1}) + \beta (l_n + m_{n}) \qquad \to \mathbf{l + m} = (l_1+m_1, l_2+m_2, \ldots , l_n+m_n, \ldots) \in W$$
3. Similarly,
$$cl_{n+2} = c(\alpha l_{n+1} + \beta l_{n})= \alpha (cl_{n+1})+ \beta (cl_n) \qquad \to c\mathbf{l} = (cl_1, cl_2, \ldots , cl_n, \ldots) \in W \qquad \blacksquare$$
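The basis claim can be illustrated numerically (a sketch, with sample coefficients $\alpha = 3$, $\beta = 4$ chosen arbitrarily):

```python
def solve(alpha, beta, x1, x2, n):
    """First n terms of a sequence with x_{k+2} = alpha*x_{k+1} + beta*x_k."""
    xs = [x1, x2]
    while len(xs) < n:
        xs.append(alpha * xs[-1] + beta * xs[-2])
    return xs

Z = solve(3, 4, 1, 0, 10)   # initial terms (1, 0)
Y = solve(3, 4, 0, 1, 10)   # initial terms (0, 1)
x = solve(3, 4, 5, 3, 10)   # an arbitrary solution, initial terms (5, 3)

# Any solution is determined by its first two terms: x = x1*Z + x2*Y.
assert x == [5 * z + 3 * y for z, y in zip(Z, Y)]
```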

(b)

We first express the recurrence relation as a matrix transformation.

$$\begin{bmatrix}x_{n+2} \\ x_{n+1} \end{bmatrix} = \begin{bmatrix}3 & 4 \\ 1 & 0 \end{bmatrix} \begin{bmatrix}x_{n+1} \\ x_{n} \end{bmatrix}$$

We then find the eigenvalues of the corresponding matrix.

$$\det(A - \lambda I) = \begin{vmatrix}3-\lambda & 4 \\ 1 & -\lambda \end{vmatrix} = (3-\lambda)(-\lambda) - 4 = \lambda^2 - 3\lambda - 4 = (\lambda + 1)(\lambda -4)$$

so the eigenvalues are $\lambda_1 = -1$ and $\lambda_2 = 4$.

Then, the explicit formula is given by $x_n = \alpha \lambda_1^n + \beta \lambda_2^n = \alpha (-1)^n + \beta (4)^n$. (Not magic: for each eigenvalue $\lambda$, the sequence $x_n = \lambda^n$ satisfies the recurrence because $\lambda^2 = 3\lambda + 4$, and since the eigenvalues are distinct these two sequences span the two dimensional solution space from part (a).)

We find the values $\alpha, \beta$ using $x_1, x_2$ by solving the system:

$$\begin{cases} x_1 = \alpha \lambda_1^1 + \beta \lambda_2^1 = \alpha (-1) + \beta (4) \\ x_2 = \alpha \lambda_1^2 + \beta \lambda_2^2 = \alpha (-1)^2 + \beta (4)^2 \end{cases} \quad \leftrightarrow \quad \begin{cases} 5 = -\alpha + 4 \beta \\ 3 = \alpha + 16 \beta \end{cases}$$

Adding the two equations gives $20\beta = 8$, so $\beta = \frac{2}{5}$ and $\alpha = 3 - 16\beta = -\frac{17}{5}$.

So, $x_n = -\frac{17}{5} \lambda_1^n + \frac{2}{5} \lambda_2^n = -\frac{17}{5}(-1)^n + \frac{2}{5} \cdot 4^n$
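As a sanity check, a short sketch can solve the $2 \times 2$ system for $\alpha, \beta$ in exact rational arithmetic and compare the closed form against the recurrence:

```python
from fractions import Fraction

# System: -alpha + 4*beta = 5 and alpha + 16*beta = 3.
# Adding the equations eliminates alpha: 20*beta = 8.
beta = Fraction(8, 20)              # 2/5
alpha = Fraction(3) - 16 * beta     # -17/5

def x_closed(n):
    return alpha * (-1) ** n + beta * 4 ** n

# Iterate x_{n+2} = 3*x_{n+1} + 4*x_n with x_1 = 5, x_2 = 3.
seq = {1: Fraction(5), 2: Fraction(3)}
for n in range(1, 9):
    seq[n + 2] = 3 * seq[n + 1] + 4 * seq[n]

assert all(x_closed(n) == seq[n] for n in range(1, 11))
print(alpha, beta)  # -17/5 2/5
```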

### Accuracy and discussion

Solution by @eleyine. Someone should check accuracy.

## Question 4

### Question

• (a) State and prove Cramer's rule for solving a system of $n$ equations in $n$ unknowns based on the three axioms given in class for the definition of the determinant function.
• (b) Given two three by three matrices $\mathit{A}$ and $\mathit{B}$ with entries from a field $\mathit{F}$, define
$$\mathit{C} = \begin{pmatrix} \mathit{A} & \mathbf{0} \\ \mathit{-I} & \mathit{B} \end{pmatrix}$$
whereby $\mathbf{0}$ is a three by three block of zeros and $\mathit{I}$ is the three by three identity matrix.
• i. Using only column and row operations, show that $\det \mathit{C} = -\det \begin{pmatrix} \mathit{-I} & \mathit{B} \\ \mathbf{0} & \mathit{AB} \end{pmatrix}$.
• ii. Using column operations and/or the row cofactor expansion, conclude $\det \mathit{A} \hspace{1mm} \det \mathit{B} = \det \mathit{C} = \det \left (\mathit{AB} \right )$

### Solution

(a) To prove Cramer's rule, use two properties:

1. adding (or subtracting) a multiple of a column to another does not change the determinant
2. $\det\begin{bmatrix} a_1 & \ldots & b a_j & \ldots & a_n \end{bmatrix} = b \det(A)$ where $A = \begin{bmatrix} a_1 & \ldots & a_j & \ldots & a_n \end{bmatrix}$ (the $a_i$ denote columns)

$$\begin{aligned} \begin{pmatrix} a_{1,1} & a_{1,2} & \dots & a_{1,j} & \dots & a_{1,n} \\ a_{2,1} & a_{2,2} & \dots & a_{2,j} & \dots & a_{2,n} \\ \vdots & \vdots & \ddots & \vdots & \ddots & \vdots \\ a_{n,1} & a_{n,2} & \dots & a_{n,j} & \dots & a_{n,n} \end{pmatrix} \begin{pmatrix} x_1 \\ \vdots \\ \vdots \\ x_n \end{pmatrix} &= \begin{pmatrix} b_1 \\ \vdots \\ \vdots \\ b_n \end{pmatrix} \\ \begin{vmatrix} a_{1,1} & a_{1,2} & \dots & b_{1} & \dots & a_{1,n} \\ a_{2,1} & a_{2,2} & \dots & b_{2} & \dots & a_{2,n} \\ \vdots & \vdots & \ddots & \vdots & \ddots & \vdots \\ a_{n,1} & a_{n,2} & \dots & b_{n} & \dots & a_{n,n} \end{vmatrix} &= \begin{vmatrix} a_{1,1} & a_{1,2} & \dots & a_{1,1}x_1 + a_{1,2}x_2 + \dots + a_{1,j}x_j+ \dots + a_{1,n}x_n & \dots & a_{1,n} \\ a_{2,1} & a_{2,2} & \dots & a_{2,1}x_1 + a_{2,2}x_2 + \dots + a_{2,j}x_j+ \dots + a_{2,n}x_n & \dots & a_{2,n} \\ \vdots & \vdots & \ddots & \vdots & \ddots & \vdots \\ a_{n,1} & a_{n,2} & \dots & a_{n,1}x_1 + a_{n,2}x_2 + \dots + a_{n,j}x_j+ \dots + a_{n,n}x_n & \dots & a_{n,n} \end{vmatrix} \\ &= \begin{vmatrix} a_{1,1} & a_{1,2} & \dots & a_{1,j}x_j & \dots & a_{1,n} \\ a_{2,1} & a_{2,2} & \dots & a_{2,j}x_j & \dots & a_{2,n} \\ \vdots & \vdots & \ddots & \vdots & \ddots & \vdots \\ a_{n,1} & a_{n,2} & \dots & a_{n,j}x_j & \dots & a_{n,n} \end{vmatrix} \qquad \text{using property 1} \\ &= x_j \begin{vmatrix} a_{1,1} & a_{1,2} & \dots & a_{1,j} & \dots & a_{1,n} \\ a_{2,1} & a_{2,2} & \dots & a_{2,j} & \dots & a_{2,n} \\ \vdots & \vdots & \ddots & \vdots & \ddots & \vdots \\ a_{n,1} & a_{n,2} & \dots & a_{n,j} & \dots & a_{n,n} \end{vmatrix} \qquad \text{using property 2} \end{aligned}$$

We obtain Cramer's rule, $x_j = \frac{\det A_j}{\det A}$ where $A_j$ denotes $A$ with its $j$-th column replaced by $b$, by dividing both sides by $\det A$ (nonzero whenever the system has a unique solution).
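As an illustration (a sketch, not part of the exam solution), Cramer's rule applied to a small concrete system:

```python
def det3(m):
    """3x3 determinant via cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def cramer3(A, b):
    """Solve Ax = b: x_j = det(A with column j replaced by b) / det(A)."""
    d = det3(A)
    xs = []
    for j in range(3):
        Aj = [row[:] for row in A]      # copy A, then swap in column j
        for i in range(3):
            Aj[i][j] = b[i]
        xs.append(det3(Aj) / d)
    return xs

A = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]
b = [3, 5, 3]
print(cramer3(A, b))  # [1.0, 1.0, 1.0]
```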

### Accuracy and discussion

Cramer's rule proof by @eleyine.
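Part (b) has no posted solution, but the identity it targets, $\det A \, \det B = \det C$ for $C = \begin{pmatrix} A & \mathbf{0} \\ -I & B \end{pmatrix}$, can at least be sanity-checked numerically (a sketch with arbitrary sample matrices):

```python
def det(M):
    """Determinant by cofactor expansion along the first row (any size)."""
    if len(M) == 1:
        return M[0][0]
    total = 0
    for j, a in enumerate(M[0]):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * a * det(minor)
    return total

A = [[1, 2, 0], [0, 1, 3], [4, 0, 1]]
B = [[2, 0, 1], [1, 1, 0], [0, 3, 1]]

# Assemble C = [[A, 0], [-I, B]] as a 6x6 matrix.
C = [A[i] + [0, 0, 0] for i in range(3)]
C += [[-1 if j == i else 0 for j in range(3)] + B[i] for i in range(3)]

assert det(C) == det(A) * det(B)
print(det(A), det(B), det(C))  # 25 5 125
```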

## Question 5

### Question

• (a) Let $\mathit{V} = \mathit{C}^4$ as an inner product space over the complex numbers $\mathit{C}$ with the usual inner product and define $\mathit{W} = \text{Span} \left \{ \left ( 1, i, -1, i \right )^t, \left ( -i, 1, i, -1 \right )^t \right \}$
• i. Find an orthonormal basis for $\mathit{W}^\perp$, the orthogonal complement of W.
• ii. Extend this basis to an orthonormal basis of $\mathit{V}$.
• (b) Let $\mathit{U}$ be an inner product space over the complex numbers $\mathit{C}$ and $\mathit{H}$ a Hermitian operator $\mathit{H} : \mathit{U} \to \mathit{U}$. Prove that eigenvectors of $\mathit{H}$ for distinct eigenvalues are orthogonal.

### Solution

None
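No solution is posted yet; for part (b), a numeric illustration (a sketch with a hypothetical $2 \times 2$ Hermitian matrix, using the usual inner product $\langle u, v \rangle = \sum_k u_k \overline{v_k}$):

```python
# H is Hermitian (equal to its conjugate transpose), with eigenvalues 1 and 3.
H = [[2, 1j], [-1j, 2]]
assert all(H[i][j] == H[j][i].conjugate() for i in range(2) for j in range(2))

def matvec(M, v):
    return [sum(M[i][k] * v[k] for k in range(2)) for i in range(2)]

u = [-1j, 1]   # eigenvector for eigenvalue 1
v = [1j, 1]    # eigenvector for eigenvalue 3
assert matvec(H, u) == [1 * c for c in u]
assert matvec(H, v) == [3 * c for c in v]

# The inner product of eigenvectors for distinct eigenvalues vanishes.
inner = sum(a * b.conjugate() for a, b in zip(u, v))
assert inner == 0
```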

### Accuracy and discussion

N/A