In this lecture, we started chapter 6 (inner product spaces). This topic will not be covered on the midterm, but it will be a big part of the final, so don't neglect it.
1 Inner product spaces
Definition: an inner product on V is a function
$$V \times V \to F, \qquad (u, v) \mapsto \langle u, v \rangle$$
such that the following conditions are satisfied:
- Positive definiteness: ⟨v,v⟩≥0 for all v∈V with equality if and only if v=0
- Linearity in the first argument: ⟨au+v,w⟩=a⟨u,w⟩+⟨v,w⟩
- Conjugate symmetry: $\langle v, w \rangle = \overline{\langle w, v \rangle}$
1.1 Examples
The Euclidean inner product: $V = F^n$. Then
$$\langle (v_1, \ldots, v_n), (w_1, \ldots, w_n) \rangle = v_1\overline{w_1} + \ldots + v_n\overline{w_n},$$
which is basically the dot product of the two vectors, except that the entries of the second vector get conjugated. (If $F = \mathbb{R}$ then this is just the standard dot product operation that we all know and love.)
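If you want to poke at this numerically, here is a quick sketch (assuming Python with numpy; the helper name `inner` is just made up here, it's not from the lecture):

```python
import numpy as np

def inner(v, w):
    # Euclidean inner product on F^n: v_1*conj(w_1) + ... + v_n*conj(w_n)
    return np.sum(v * np.conj(w))

v = np.array([1 + 2j, 3 - 1j])
w = np.array([2 - 1j, 1j])

print(inner(v, w))                                      # (-1+2j)
print(inner(v, v))                                      # (15+0j): real and >= 0, as promised
print(np.isclose(inner(v, w), np.conj(inner(w, v))))    # True (conjugate symmetry)
```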
Integration over a particular region: $V = P_m(F)$. Then
$$\langle p, q \rangle = \int_0^1 p(x)\,\overline{q(x)} \, dx$$
As an exercise, verify that this satisfies the conditions above.
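Here's the same inner product in code (a sketch assuming Python with sympy; real coefficients, so the conjugate does nothing and is omitted; `poly_inner` is just an illustrative name):

```python
import sympy as sp

x = sp.symbols("x")

def poly_inner(p, q):
    # <p, q> = integral of p(x) * q(x) over [0, 1]  (real coefficients)
    return sp.integrate(p * q, (x, 0, 1))

p = x**2 + 1
q = 2 * x
print(poly_inner(p, q))   # 3/2
print(poly_inner(p, p))   # 28/15, which is > 0, consistent with positive definiteness
```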
1.2 Inner products as linear functions
For a fixed w∈V, we can define the function
$$f_w : V \to F, \qquad v \mapsto \langle v, w \rangle$$
Let's show that this function is linear:
$$f_w(au+v) = \langle au+v, w \rangle = a\langle u, w \rangle + \langle v, w \rangle = af_w(u) + f_w(v) \checkmark$$
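As a sanity check (again just a numpy sketch, nothing here is from the lecture itself), the linearity also holds numerically:

```python
import numpy as np

def inner(v, w):
    # Euclidean inner product, conjugating the second argument
    return np.sum(v * np.conj(w))

w = np.array([1 - 1j, 2 + 3j])
f_w = lambda v: inner(v, w)            # the map v ↦ ⟨v, w⟩

a = 2 + 1j
u = np.array([1j, 4.0])
v = np.array([3 - 2j, -1 + 1j])

# f_w(a*u + v) should equal a*f_w(u) + f_w(v)
print(np.isclose(f_w(a * u + v), a * f_w(u) + f_w(v)))   # True
```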
As an exercise, prove the following properties:
- ⟨v,0⟩=⟨0,v⟩=0
- $\langle u, av + w \rangle = \overline{a}\langle u, v \rangle + \langle u, w \rangle$ (note the conjugate on a)
From now on, we assume that V denotes an inner product space - i.e., a vector space equipped with some inner product.
1.3 Norms
For v∈V, the norm of v, denoted by ‖v‖, is defined as
‖v‖=√⟨v,v⟩
Note that the radicand is always ≥0 due to positive definiteness.
1.3.1 Examples
$V = F^n$, with the Euclidean inner product as defined before:
$$\|(z_1, \ldots, z_n)\| = \sqrt{|z_1|^2 + \ldots + |z_n|^2}$$
$V = P_3(\mathbb{R})$, $p(x) = x^2 + 1$:
$$\|p(x)\| = \sqrt{\langle x^2+1, x^2+1 \rangle} = \sqrt{\int_0^1 (x^2+1)^2 \, dx}$$
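Carrying the integral out, just to see an actual number (it's the same 28/15 as ⟨p,p⟩ in the sympy snippet above):
$$\|p(x)\| = \sqrt{\int_0^1 (x^4 + 2x^2 + 1) \, dx} = \sqrt{\tfrac{1}{5} + \tfrac{2}{3} + 1} = \sqrt{\tfrac{28}{15}} \approx 1.37$$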
1.3.2 Properties
Just one for now. Proof: left as an exercise.
For any a∈F, v∈V:
‖av‖=|a|‖v‖
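A quick numeric spot check (numpy sketch; not a proof, obviously):

```python
import numpy as np

v = np.array([1 + 2j, -3j, 2.0])
a = 3 - 4j                                           # |a| = 5

norm = lambda v: np.sqrt(np.sum(v * np.conj(v)).real)
print(np.isclose(norm(a * v), abs(a) * norm(v)))     # True
```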
1.3.3 Orthogonality
u,v∈V are orthogonal if ⟨u,v⟩=0. (Note that this is a symmetric relation: by conjugate symmetry, ⟨u,v⟩=0 if and only if ⟨v,u⟩=0.)
1.3.4 The Pythagorean theorem
If ⟨u,v⟩=0, then:
$$\|u+v\|^2 = \|u\|^2 + \|v\|^2$$
Proof:
$$\|u+v\|^2 = \langle u+v, u+v \rangle = \|u\|^2 + \|v\|^2 + \underbrace{\langle u, v \rangle}_{=0} + \underbrace{\langle v, u \rangle}_{\text{also } =0} = \|u\|^2 + \|v\|^2 \checkmark$$
which was a nice one-liner.
(To see that ⟨u,v⟩=⟨v,u⟩=0, just use conjugate symmetry. Proof left as an exercise for the astute reader.)
If we are working in $\mathbb{R}^n$, then the converse of the theorem holds true as well: $\|u+v\|^2 = \|u\|^2 + \|v\|^2$ implies that $\langle u, v \rangle = \langle v, u \rangle = 0$.
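Here's the theorem in action on a concrete pair of orthogonal vectors (numpy sketch over $\mathbb{R}^3$; the vectors are just an arbitrary orthogonal pair I picked):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([2.0, 1.0, -2.0])        # <u, v> = 2 + 2 - 4 = 0

norm_sq = lambda x: np.dot(x, x)
print(np.dot(u, v))                                            # 0.0
print(np.isclose(norm_sq(u + v), norm_sq(u) + norm_sq(v)))     # True: 18 = 9 + 9
```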
1.3.5 The Cauchy-Schwarz inequality
|⟨u,v⟩|≤‖u‖⋅‖v‖
Equality holds precisely when one of the two vectors is a scalar multiple of the other (e.g., u=λv for some λ∈F).
Proof: let u,v∈V. If v=0, then we have 0 on both sides of the inequality and we're done; equality holds trivially. So let's assume that v≠0. Consider
$$u = \frac{\langle u, v \rangle}{\|v\|^2} v + w$$
where $w = u - \frac{\langle u, v \rangle}{\|v\|^2} v$ is orthogonal to v: by linearity in the first argument, $\langle w, v \rangle = \langle u, v \rangle - \frac{\langle u, v \rangle}{\|v\|^2}\langle v, v \rangle = 0$. By the Pythagorean theorem,
$$\|u\|^2 = \left\| \frac{\langle u, v \rangle}{\|v\|^2} v + w \right\|^2 = \frac{|\langle u, v \rangle|^2}{\|v\|^2} + \underbrace{\|w\|^2}_{\geq 0} \geq \frac{|\langle u, v \rangle|^2}{\|v\|^2}$$
Consequently, by multiplying both sides by $\|v\|^2$, we get
$$|\langle u, v \rangle|^2 \leq \|u\|^2 \cdot \|v\|^2$$
and if we take the square root of both sides then we get the Cauchy-Schwarz inequality.
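And a quick numerical check of Cauchy-Schwarz (numpy sketch with random complex vectors, plus the equality case with scalar multiples; `inner` and `norm` are helpers made up here):

```python
import numpy as np

rng = np.random.default_rng(0)

def inner(v, w):
    # Euclidean inner product on C^n, conjugating the second argument
    return np.sum(v * np.conj(w))

norm = lambda v: np.sqrt(inner(v, v).real)

# |<u, v>| <= ||u|| * ||v|| should hold for any u, v
for _ in range(5):
    u = rng.normal(size=4) + 1j * rng.normal(size=4)
    v = rng.normal(size=4) + 1j * rng.normal(size=4)
    print(abs(inner(u, v)) <= norm(u) * norm(v))               # True every time

# equality when the vectors are scalar multiples of each other
u = np.array([1 + 1j, 2.0, -1j])
v = (2 - 3j) * u
print(np.isclose(abs(inner(u, v)), norm(u) * norm(v)))         # True
```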