Math 270 - Applied Linear Algebra
Lecture 1
1 Problem 1: Solve:
$$ \begin{eqnarray*} \begin{bmatrix}1 & -2 & -1 & 3|1\\ 2 & -4 & 1 & 0|5\\ 1 & -2 & 2 & -3|4 \end{bmatrix} & \rightarrow & \begin{bmatrix}1 & -2 & -1 & 3\\ 2 & -4 & 1 & 0\\ 1 & -2 & 2 & -3 \end{bmatrix}\vec{x}=\begin{bmatrix}1\\ 5\\ 4 \end{bmatrix} \end{eqnarray*} $$
Recall the general algorithm for reducing a matrix to row-echelon form (Gaussian elimination).
Step 1. If the matrix has only zero entries, then we stop.
Step 2. Find the first column from the left containing a nonzero entry $k$, and move the row containing that entry to the top of the matrix.
Step 3. Multiply the first row by $\frac{1}{k}$ to create the first leading $1$.
Step 4. Make each entry below this leading $1$ zero by subtracting suitable multiples of the first row from the remaining rows.
Step 5. Now apply Steps 1-5 to the matrix consisting of the remaining rows.
This brings the matrix into row-echelon form. To obtain reduced row-echelon form, use the rows with leading $1$'s to make the entries above each leading $1$ zero as well. In fact, you can already do this while performing Step 4.
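Before continuing with the example, here is a minimal Python/NumPy sketch of the algorithm above (not part of the original notes; the function and variable names are my own). It clears the entries above each leading $1$ as well, so it returns the reduced row-echelon form directly.

```python
import numpy as np

def rref(M, tol=1e-12):
    """Row-reduce M following Steps 1-5, clearing above each leading 1 as well."""
    A = M.astype(float).copy()
    rows, cols = A.shape
    pivot_row = 0
    for col in range(cols):
        # Step 2: find a row at or below pivot_row with a nonzero entry in this column.
        candidates = np.where(np.abs(A[pivot_row:, col]) > tol)[0]
        if candidates.size == 0:
            continue                                   # no pivot in this column
        swap = pivot_row + candidates[0]
        A[[pivot_row, swap]] = A[[swap, pivot_row]]    # move that row up
        A[pivot_row] /= A[pivot_row, col]              # Step 3: create the leading 1
        for r in range(rows):                          # Step 4 (clearing above as well)
            if r != pivot_row:
                A[r] -= A[r, col] * A[pivot_row]
        pivot_row += 1                                 # Step 5: recurse on remaining rows
        if pivot_row == rows:
            break
    return A

# Augmented matrix of Problem 1:
aug = np.array([[1, -2, -1,  3, 1],
                [2, -4,  1,  0, 5],
                [1, -2,  2, -3, 4]])
print(rref(aug))
```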
Continuing:
$$ \begin{eqnarray*}
R_{2}-2R_{1},R_{3}-2R_{1} & \sim & \begin{bmatrix}1 & -2 & -1 & 3|1\\
0 & 0 & 3 & -6|3\\
0 & 0 & 3 & -6|3
\end{bmatrix}\\
\\
R_{3}-R_{2} & \sim & \begin{bmatrix}1 & -2 & -1 & 3|1\\
0 & 0 & 3 & -6|3\\
0 & 0 & 0 & 0|0
\end{bmatrix}\\
\\
R_{2}/3 & \sim & \begin{bmatrix}1 & -2 & -1 & 3|1\\
0 & 0 & 1 & -2|1\\
0 & 0 & 0 & 0|0
\end{bmatrix}\\
\\
R_{1}+R_{2} & \sim & \begin{bmatrix}1 & -2 & 0 & 1|2\\
0 & 0 & 1 & -2|1\\
0 & 0 & 0 & 0|0
\end{bmatrix}=\mbox{RREF}(A)
\end{eqnarray*} $$
Translating into equations:
$$ \begin{eqnarray*} x_{1}-2x_{2}+x_{4} & = & 2\\ x_{3}-2x_{4} & = & 1\\ 0 & = & 0 \end{eqnarray*} $$
and into solutions:
$$ \begin{eqnarray*} x_{2} & = & s\\ x_{4} & = & t\\ x_{1} & = & 2+2s-t\\ x_{3} & = & 1+2t \end{eqnarray*} $$
Other notation (the vector-parametric form of the general solution):
$$ X=\begin{bmatrix}x_{1}\\ x_{2}\\ x_{3}\\ x_{4} \end{bmatrix}=\begin{bmatrix}2+2s-t\\ s\\ 1+2t\\ t \end{bmatrix}=\begin{bmatrix}2\\ 0\\ 1\\ 0 \end{bmatrix}+s\begin{bmatrix}2\\ 1\\ 0\\ 0 \end{bmatrix}+t\begin{bmatrix}-1\\ 0\\ 2\\ 1 \end{bmatrix} $$
Substituting any values $s,t\in\Re$ gives a solution to the initial
system.
For example, $s=t=0$:
$$
\begin{bmatrix}1 & -2 & -1 & 3\\
2 & -4 & 1 & 0\\
1 & -2 & 2 & -3
\end{bmatrix}\begin{bmatrix}2\\
0\\
1\\
0
\end{bmatrix}=\begin{bmatrix}1\\
5\\
4
\end{bmatrix}
$$
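This check can also be done numerically; the following is a small NumPy sketch (my own, not part of the notes) verifying both the particular solution and the general one:

```python
import numpy as np

A = np.array([[1, -2, -1,  3],
              [2, -4,  1,  0],
              [1, -2,  2, -3]])
b = np.array([1, 5, 4])

x_p = np.array([2, 0, 1, 0])    # particular solution (s = t = 0)
t1  = np.array([2, 1, 0, 0])    # basic solutions from the parametric form
t2  = np.array([-1, 0, 2, 1])

print(A @ x_p)                                    # -> [1 5 4], i.e. b
s, t = 2.0, -3.0                                  # any values work
print(np.allclose(A @ (x_p + s*t1 + t*t2), b))    # -> True
```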
Remarks: A general solution to the associated homogeneous system $AX=0$
is $s\begin{bmatrix}2\\
1\\
0\\
0
\end{bmatrix}+t\begin{bmatrix}-1\\
0\\
2\\
1
\end{bmatrix}$, i.e., writing $T_{1}$ and $T_{2}$ for these two vectors,
$$ \begin{eqnarray*}
\mbox{Null}\left(A\right) & = & \mbox{span}\{ T_{1},T_{2}\} \\
\end{eqnarray*} $$
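The null space can also be computed directly, for example with SymPy (a sketch of my own; `nullspace()` returns a list of basis vectors whose span is $\mbox{Null}(A)$):

```python
import sympy as sp

A = sp.Matrix([[1, -2, -1,  3],
               [2, -4,  1,  0],
               [1, -2,  2, -3]])

for v in A.nullspace():     # basis of Null(A); it matches T1 and T2 above
    print(v.T)
```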
1.1 Recall Rank:
A ton of theorems associated with rank:
$$ \begin{eqnarray*} rkA & = & \mbox{# of nonzero rows in rref(A)}\\ \\ & \overset{thm}{=} & \mbox{# of lin. independent rows}\overset{def}{=}\mbox{dim}\left(\mbox{row}\left(A\right)\right)\\ \\ & \overset{thm}{=} & \mbox{# of lin. independent cols}\overset{def}{=}\mbox{dim}\left(\mbox{col}\left(A\right)\right) \end{eqnarray*} $$
Exercise:
$rkA=2$. Find two independent rows in $A$ and express the third as a linear combination
of these two. Do the same for the columns ($C_{2}=\left(-2\right)\cdot C_{1}$).
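A small NumPy check of this exercise (my own sketch, not part of the notes):

```python
import numpy as np

A = np.array([[1, -2, -1,  3],
              [2, -4,  1,  0],
              [1, -2,  2, -3]])

print(np.linalg.matrix_rank(A))                  # -> 2
# The hinted column relation C2 = (-2) * C1:
print(np.array_equal(A[:, 1], -2 * A[:, 0]))     # -> True
```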
1.1.1 Example 2:
For which values of $a$ is $D=\begin{bmatrix}1 & 1 & 1\\ 1 & 2 & a\\ 2 & a & 4 \end{bmatrix}$ invertible?
Thm: A is invertible $\iff$ $\mbox{det}\left(A\right)=\left|A\right|\neq0$
In this case (expansion along the last column):
$$ \begin{eqnarray*}
\det\left(D\right) & = & 1\cdot\left|\begin{matrix}1 & 2\\
2 & a
\end{matrix}\right|-a\cdot\left|\begin{matrix}1 & 1\\
2 & a
\end{matrix}\right|+4\left|\begin{matrix}1 & 1\\
1 & 2
\end{matrix}\right|\\
\\
& = & a-4-a(a-2)+4(2-1)\\
\\
& = & 3a-a^2=a(3-a)
\end{eqnarray*} $$
So $D$ is invertible $\iff$ $a\neq0$ and $a\neq 3$
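The same computation can be done symbolically, for instance with SymPy (my own sketch):

```python
import sympy as sp

a = sp.symbols('a')
D = sp.Matrix([[1, 1, 1],
               [1, 2, a],
               [2, a, 4]])

det_D = sp.factor(D.det())
print(det_D)                            # -> -a*(a - 3), i.e. a*(3 - a)
print(sp.solve(sp.Eq(det_D, 0), a))     # -> [0, 3]: exactly where D fails to be invertible
```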
1.1.1.1 Recall Cofactor and Det
$$ \begin{eqnarray*} A_{n\times n} & = & \left[a_{ij}\right]_{n\times n}\\ \\ C_{ij} & = & \left(-1\right)^{i+j}\det\left(A_{ij}\right) \end{eqnarray*} $$ where $A_{ij}$ is the matrix $A$ with row $i$ and column $j$ deleted.
1.1.1.2 Why is invertibility important?
If $A_{n\times n}$ is invertible, then any equation $AX=B$ has a
unique solution:
Since $A$ is invertible, $A^{-1}\left(AX\right)=A^{-1}B\rightarrow\left(A^{-1}A\right)X=A^{-1}B\rightarrow IX=A^{-1}B\rightarrow X=A^{-1}B$
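A numerical illustration (my own sketch; the matrix is $D$ from Example 2 with $a=1$, and the right-hand side $B$ is an arbitrary choice):

```python
import numpy as np

D = np.array([[1.0, 1.0, 1.0],      # D with a = 1, invertible since a is neither 0 nor 3
              [1.0, 2.0, 1.0],
              [2.0, 1.0, 4.0]])
B = np.array([1.0, 0.0, 2.0])

X = np.linalg.inv(D) @ B            # X = D^{-1} B, the unique solution
print(np.allclose(D @ X, B))        # -> True
# Numerically, np.linalg.solve is preferred over forming the inverse explicitly:
print(np.allclose(np.linalg.solve(D, B), X))    # -> True
```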
1.1.1.3 Transformations:
$T:\Re^{n}\rightarrow\Re^{m}$ is a linear transformation $\overset{def}{\iff}$$\begin{cases} T\left(X+Y\right) & =T\left(X\right)+T\left(Y\right)\\ T\left(\lambda X\right) & =\lambda T\left(X\right) \end{cases}$.
Fact: Every linear transformation from $\Re^n$ to $\Re^m$ is a matrix transformation,
i.e. $T\left(X\right)=A\cdot X$ for some $m\times n$ matrix $A$.
Problem: Find the matrix of a rotation in 2D by angle $\theta$.
Let $R_{\theta}$ be rotation by $\theta$. Let $A$ be its matrix.
$$ \begin{eqnarray*}
\Re^{2} & \rightarrow & \Re^{2}\\
\\
\begin{bmatrix}x\\
y
\end{bmatrix} & \rightarrow & \begin{bmatrix}a & b\\
c & d
\end{bmatrix}\begin{bmatrix}x\\
y
\end{bmatrix}=\begin{bmatrix}ax+by\\
cx+dy
\end{bmatrix}\\
\\
\begin{bmatrix}x\\
y
\end{bmatrix} & \rightarrow & \begin{bmatrix}ax+by\\
cx+dy
\end{bmatrix}
\end{eqnarray*} $$
Idea: Compute $R_{\theta}\left(e_{1}\right),R_{\theta}\left(e_{2}\right)$
where $e_{1}=\begin{bmatrix}1\\
0
\end{bmatrix},e_{2}=\begin{bmatrix}0\\
1
\end{bmatrix}$.
In this way we compute the columns of $A$:
if $A=\begin{bmatrix}a & b\\ c & d \end{bmatrix}$ then $Ae_{1}=\begin{bmatrix}a & b\\ c & d \end{bmatrix}\begin{bmatrix}1\\ 0 \end{bmatrix}=\begin{bmatrix}a\\ c \end{bmatrix}$ and $Ae_{2}=\begin{bmatrix}b\\ d \end{bmatrix}$
Here (using a picture and definitions of trigonometric functions) we computed:
$R_{\theta}\left(e_{1}\right)=Ae_{1}=\begin{bmatrix}\cos\theta\\
\sin\theta
\end{bmatrix}$ and $R_{\theta}\left(e_{2}\right)=\begin{bmatrix}-\sin\theta\\
\cos\theta
\end{bmatrix}$
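A quick numerical illustration (my own sketch): the columns of this matrix are indeed $R_{\theta}(e_{1})$ and $R_{\theta}(e_{2})$, and multiplying by it rotates a vector by $\theta$.

```python
import numpy as np

theta = np.pi / 2                    # rotate by 90 degrees
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])
print(A @ e1)                        # ~ [0, 1] = (cos(theta), sin(theta))
print(A @ e2)                        # ~ [-1, 0] = (-sin(theta), cos(theta))
print(A @ np.array([1.0, 1.0]))      # ~ [-1, 1]: the vector (1, 1) rotated by 90 degrees
```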
1.1.2 Typical problem we will deal with later: Is A diagonalizable?
Def: $A$ is diagonalizable $\iff\exists P_{n\times n}$ invertible
such that $P^{-1}AP$ is a diagonal matrix.
Let us check whether the rotation matrix $A$ is diagonalizable:
1. Characteristic polynomial: $c_{A}\left(x\right)\overset{def}{=}\det\left(xI_{n}-A\right)$
$$ \begin{eqnarray*} c_{A}\left(x\right) & = & \det\left(x\begin{bmatrix}1 & 0\\ 0 & 1 \end{bmatrix}-\begin{bmatrix}\cos\theta & -\sin\theta\\ \sin\theta & \cos\theta \end{bmatrix}\right)\\ \\ & = & \left|\begin{matrix}x-\cos\theta & \sin\theta\\ -\sin\theta & x-\cos\theta \end{matrix}\right|\\ \\ & = & \left(x-\cos\theta\right)^{2}+\sin^{2}\theta=x^{2}-\left(2\cos\theta\right)x+1 \end{eqnarray*} $$
2. Recall that the eigenvalues of $A$ are the roots of $c_{A}\left(x\right)$.
(But the definition of an eigenvalue is: $\lambda$ is an eigenvalue
of $A\iff\exists X\neq0$ s.t. $AX=\lambda X$.)
Indeed, $c_{A}\left(\lambda\right)=0\overset{def}{\iff}\det\left(\lambda I_{n}-A\right)=0\iff\lambda I-A$
is not invertible $\iff\left(\lambda I-A\right)X=0$ has a nonzero solution
$\iff\exists X\neq0\ \mbox{s.t.}\ AX=\lambda X$.
Here the discriminant is $\Delta=4\cos^{2}\theta-4=4\left(\cos^{2}\theta-1\right)=-4\sin^{2}\theta\le0$.
Thus, $A$ is diagonalizable over $\Re$ $\iff\cos^{2}\theta=1\iff\theta=0,\pi$.
Remark: So for $\theta\neq0,\pi$ the matrix is not diagonalizable over $\Re$. The field $\Re$ of real numbers should be blamed for this, as the characteristic polynomial has no real roots in this case. We will learn later that one can remedy this by introducing complex numbers (the rotation matrices are diagonalizable over the complex numbers).
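A numerical illustration of this remark (my own sketch): for a generic angle the eigenvalues of the rotation matrix come out as a complex-conjugate pair, and they are roots of $x^{2}-\left(2\cos\theta\right)x+1$.

```python
import numpy as np

theta = np.pi / 3                    # any angle other than 0 or pi
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

eigvals = np.linalg.eigvals(A)
print(eigvals)                       # complex pair cos(theta) +/- i*sin(theta)
# They satisfy the characteristic polynomial x^2 - 2*cos(theta)*x + 1:
print(np.allclose(eigvals**2 - 2*np.cos(theta)*eigvals + 1, 0))   # -> True
```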