# A calculus of the absurd

##### 22.7.6 Gram-Schmidt orthonormalisation

Gram-Schmidt orthonormalisation provides a useful way to turn a set of linearly independent vectors into a set of orthogonal (and, after normalising, orthonormal) vectors with the same span.

The key idea is that we build our set of vectors inductively: if our set of vectors is $$S$$ (a finite subset of some inner product space $$V$$), then we order the set (it doesn't matter how; any ordering will do) and build sets $$S'_1, S'_2, \ldots, S'_{|S|}$$ such that $$S'_k$$ is an orthogonal set of $$k$$ vectors which satisfies

$$\Span (\text {first } k \text { vectors in } S) = \Span (S'_k).$$
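To make the invariant concrete, here is a small example in $$\mathbb {R}^2$$ (the specific vectors are my own choice, not from the text):

$$S = \{(1,1), (1,0)\}, \qquad S'_1 = \{(1,1)\}, \qquad S'_2 = \left \{(1,1), \left (\tfrac {1}{2}, -\tfrac {1}{2}\right )\right \}.$$

Here $$\Span ((1,1)) = \Span (S'_1)$$ and $$\Span ((1,1),(1,0)) = \Span (S'_2) = \mathbb {R}^2$$; the second vector of $$S'_2$$ is obtained by subtracting from $$(1,0)$$ its projection onto $$(1,1)$$, and one can check directly that $$\langle (1,1), (\tfrac {1}{2}, -\tfrac {1}{2}) \rangle = 0$$.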

Clearly the main thing which is missing here is the step which takes us from $$S'_{k}$$ to $$S'_{k+1}$$. There are several ways to find this step:

• Consider specific examples of linearly independent vectors in well-known vector spaces (for example $$\mathbb {R}^2$$) and guess the formula (pun entirely unintended) for performing this orthonormalisation process.

• Try to write a proof that our method works and, in doing so, fill in the details of the method itself.

I will try the latter, because I think it is a much more fun approach. We start by creating $$S'_1 = \{s_1\}$$, the set containing just the first element of $$S$$; a set consisting of a single vector is trivially orthogonal.

Now, suppose that we have constructed $$S'_k$$. We would like to find a way to build $$S'_{k+1}$$. Clearly we should add $$s_{k+1}$$ to this set; the question, of course, is how. We need our new vector, say $$s'_{k+1}$$, to be such that

\begin{align} \langle s'_{k+1}, s'_j \rangle = 0 && \forall j, \  1 \leq j \leq k. \end{align}
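This orthogonality condition leads to the classical Gram-Schmidt update, in which each projection of $$s_{k+1}$$ onto an existing $$s'_j$$ is subtracted off. As a sketch of where the derivation is headed, here is that update in Python (the function names, and the representation of vectors as plain lists of floats, are my own choices):

```python
def dot(u, v):
    # the standard inner product on R^n
    return sum(x * y for x, y in zip(u, v))

def gram_schmidt(vectors):
    """Turn a list of linearly independent vectors into an
    orthogonal list with the same span."""
    orthogonal = []
    for s in vectors:
        # subtract from s its projection onto each vector built so
        # far, so the new vector is orthogonal to all of them
        s_new = list(s)
        for q in orthogonal:
            coeff = dot(s, q) / dot(q, q)
            s_new = [x - coeff * y for x, y in zip(s_new, q)]
        orthogonal.append(s_new)
    return orthogonal
```

For the $$\mathbb {R}^2$$ example, `gram_schmidt([[1.0, 1.0], [1.0, 0.0]])` keeps $$(1,1)$$ and replaces $$(1,0)$$ with $$(\tfrac {1}{2}, -\tfrac {1}{2})$$, which is orthogonal to $$(1,1)$$. Dividing each output vector by its norm would then make the set orthonormal.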