# 19.4 Determinants

There are many ways to think about determinants.

## 19.4.1 Combinatorial approach

###### Definition 19.4.1

We define the set $S_{n}$ as the set of bijective maps from $[n]$ to $[n]$; such maps are called permutations.

###### Theorem 19.4.1

The algebra $(S_{n},\circ)$ is a group (where $\circ$ is the function composition operator).

Proof: function composition is associative (todo: link to proof), so all we must do is show that there exists a neutral element — which it is not unreasonable to guess is the identity function (i.e. $e:[n]\to[n]$ where $e(i)=i$ for all $i\in[n]$) — and then prove that all permutations are invertible (which we get for free, because we know that all $\sigma\in S_{n}$ are bijective, and bijective functions are invertible).

To show that $e$ is a neutral element, consider some $\sigma\in S_{n}$ for which,

$$(e\circ\sigma)(x) = e(\sigma(x)) \qquad (19.114)$$

$$= \sigma(x), \qquad (19.115)$$

as $e$ maps every point to itself. That $\sigma\circ e=\sigma$ is proven in the same way.
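These group axioms can be checked by brute force for a small $n$. The sketch below is not from the text: it represents permutations in 0-indexed one-line notation as tuples, an assumption of this snippet.

```python
# Brute-force check of the group axioms of S_3.
# A permutation sigma of [n] is stored as a tuple p with p[i] = sigma(i+1) - 1.
from itertools import permutations

n = 3
S_n = list(permutations(range(n)))

def compose(p, q):
    """(p ∘ q)(i) = p(q(i)): apply q first, then p."""
    return tuple(p[q[i]] for i in range(n))

identity = tuple(range(n))

# Closure, and neutrality of the identity on both sides.
assert all(compose(p, q) in S_n for p in S_n for q in S_n)
assert all(compose(identity, p) == p == compose(p, identity) for p in S_n)
# Every permutation has an inverse (because it is a bijection).
assert all(any(compose(p, q) == identity for q in S_n) for p in S_n)
```

Associativity holds because function composition is associative in general, so the snippet only checks the remaining axioms.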

###### Definition 19.4.2

The length of a permutation $\sigma\in S_{n}$ is the number of "inversions" in the permutation, that is, the number of pairs $(i,j)$ with $i<j$ but $\sigma(i)>\sigma(j)$. The sign of $\sigma$, written $(-1)^{\sigma}$, is $+1$ if the length is even and $-1$ if it is odd.
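Concretely, counting inversions gives the sign $(-1)^{\sigma}$. A minimal sketch (the names `inversions` and `sign` are hypothetical helpers, not from the text):

```python
# Count inversions of a permutation given in one-line notation,
# e.g. (3, 1, 2) means sigma(1) = 3, sigma(2) = 1, sigma(3) = 2.
def inversions(sigma):
    n = len(sigma)
    return sum(1 for i in range(n)
                 for j in range(i + 1, n) if sigma[i] > sigma[j])

def sign(sigma):
    """(-1)^sigma: -1 raised to the number of inversions."""
    return (-1) ** inversions(sigma)

assert inversions((1, 2, 3)) == 0   # identity: no inversions
assert inversions((3, 1, 2)) == 2   # the pairs (3,1) and (3,2)
assert sign((2, 1, 3)) == -1        # a single transposition is odd
```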

###### Definition 19.4.3

One way to define the determinant is using "permutations", that is, for $A\in\mathbb{K}^{n\times n}$,

$$\det(A) = \sum_{\sigma\in S_{n}}\left[(-1)^{\sigma} A_{1,\sigma(1)} A_{2,\sigma(2)} \cdots A_{n,\sigma(n)}\right] \qquad (19.116)$$
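A direct transcription of this definition, as a sketch (the helper names `sign` and `det` are assumptions; note this is $O(n\cdot n!)$, so it is only usable for tiny matrices):

```python
# The permutation (Leibniz) definition of the determinant.
from itertools import permutations

def sign(sigma):
    """(-1)^sigma via the inversion count."""
    n = len(sigma)
    inv = sum(1 for i in range(n)
                for j in range(i + 1, n) if sigma[i] > sigma[j])
    return (-1) ** inv

def det(A):
    """det(A) = sum over sigma in S_n of (-1)^sigma * prod_i A[i][sigma(i)]."""
    n = len(A)
    total = 0
    for sigma in permutations(range(n)):
        term = sign(sigma)
        for i in range(n):
            term *= A[i][sigma[i]]
        total += term
    return total

assert det([[1, 2], [3, 4]]) == 1 * 4 - 2 * 3   # the familiar 2x2 formula
assert det([[2, 0], [0, 5]]) == 10              # diagonal: product of entries
```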

### Multiplicativity of the determinant

This is a very important property of the determinant.

The basic result we are after is that

###### Theorem 19.4.2

Let $A$ and $B$ be $n\times n$ matrices over a field $\mathbb{K}$; then

$$\det(AB) = \det(A)\times\det(B). \qquad (19.117)$$

Proof: todo.
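Even without the proof, the identity is easy to test numerically. A brute-force sketch using the permutation definition of the determinant (the matrices are arbitrary examples, not from the text):

```python
# Check det(AB) = det(A) det(B) on small integer matrices.
from itertools import permutations

def sign(s):
    return (-1) ** sum(1 for i in range(len(s))
                         for j in range(i + 1, len(s)) if s[i] > s[j])

def det(A):
    """Permutation (Leibniz) definition of the determinant."""
    n = len(A)
    total = 0
    for s in permutations(range(n)):
        term = sign(s)
        for i in range(n):
            term *= A[i][s[i]]
        total += term
    return total

def matmul(A, B):
    """Standard n x n matrix product."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 2, 0], [0, 1, 3], [4, 0, 1]]   # arbitrary example matrices
B = [[2, 1, 0], [1, 0, 1], [0, 2, 2]]
assert det(matmul(A, B)) == det(A) * det(B)
```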

### Block multiplication property

###### Theorem 19.4.3

Let $\mathbb{K}$ be a field, and let $A\in\mathbb{K}^{m\times m}$, $B\in\mathbb{K}^{m\times(n-m)}$, $C\in\mathbb{K}^{(n-m)\times(n-m)}$ and $0\in\mathbb{K}^{(n-m)\times m}$ such that all entries in $0$ are (surprise!) the zero element.

Let $M\in\mathbb{K}^{n\times n}$ be the matrix whose entries come from the field $\mathbb{K}$, defined in block form by

$$M = \begin{pmatrix} A & B \\ 0 & C \end{pmatrix}. \qquad (19.118)$$

Then,

$$\det(M) = \det(A)\times\det(C) \qquad (19.119)$$

Proof: we use the standard method for proving identities (if it looks like an identity, walks like an identity, and quacks like an identity then it probably is an identity). We start by considering $\det(M)$,

$$\det(M) = \sum_{\sigma\in S_{n}}\left[(-1)^{\sigma} M_{1,\sigma(1)} M_{2,\sigma(2)} \cdots M_{n,\sigma(n)}\right] \qquad (19.120)$$

If we think carefully about what we want, we’d like to end up with something that looks a bit like $\det(A)\times\det(C)$, and thus we should try to split this into two parts - one for $\det(A)$ and one for $\det(C)$. We can try something like

$$= \sum_{\sigma\in S_{n}}\left[(-1)^{\sigma} \underbrace{\prod_{1\leqq i\leqq m} M_{i,\sigma(i)}}_{\text{to line up with }A} \underbrace{\prod_{m<j\leqq n} M_{j,\sigma(j)}}_{\text{to line up with }C}\right] \qquad (19.121)$$

Here we have something which is close to what we want. Ideally (or, as they say, "if there is any fairness in the universe") we would have a permutation $\sigma$ such that $1\leqq\sigma(i)\leqq m$ (note: this is the same $i$, with the same bounds, as the one in the product), as that way the entry with index $i,\sigma(i)$ will be in $A$ (as $A$ is in the top-left corner and is an $m\times m$ matrix), and such that $m<\sigma(j)\leqq n$ (for the same reasons as the first). We can have this!

What we can do is split the sum into two parts; in the first part we will sum over all the desirable permutations, and in the second part we will sum over all the undesirable permutations. This is of course not a $\text{rigorous}^{\text{TM}}$ formulation. We really want to define a set $X\subseteq S_{n}$ such that for all $p\in X$, we have for all $1\leqq i\leqq m$ that $1\leqq p(i)\leqq m$, and also for all $m<j\leqq n$ it is the case that $m<p(j)\leqq n$. Then, we can partition $S_{n}$ as $S_{n}=X\cup X^{\complement}$, where $X^{\complement}$ denotes all the items in $S_{n}$ that are not in $X$ (which means $X\cap X^{\complement}=\emptyset$). Therefore,
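The set $X$ is concrete enough to enumerate for small $n$. An illustrative sketch (0-indexed permutations; the specific $n$, $m$ are arbitrary choices), which also verifies that $|X| = m!\,(n-m)!$, i.e. $X$ consists of exactly one permutation per pair of block permutations:

```python
# Enumerate the "desirable" permutations: those mapping the first m indices
# among themselves (and hence the last n - m indices among themselves).
from itertools import permutations
from math import factorial

n, m = 5, 2
X = [s for s in permutations(range(n))
     if all(s[i] < m for i in range(m))        # first block preserved
     and all(s[j] >= m for j in range(m, n))]  # second block preserved

assert len(X) == factorial(m) * factorial(n - m)   # 2! * 3! = 12
```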

$$\det(M) = \sum_{\sigma\in X}\left[(-1)^{\sigma} \prod_{1\leqq i\leqq m} M_{i,\sigma(i)} \prod_{m<j\leqq n} M_{j,\sigma(j)}\right] + \sum_{\sigma\in X^{\complement}}\left[(-1)^{\sigma} \prod_{1\leqq i\leqq m} M_{i,\sigma(i)} \prod_{m<j\leqq n} M_{j,\sigma(j)}\right] \qquad (19.123)$$

Note that the right-hand sum is zero: for each $\sigma\in X^{\complement}$ there exists some $j$ with $m<j\leqq n$ such that $1\leqq\sigma(j)\leqq m$, that is, the product contains some entry $M_{j,\sigma(j)}$ lying in the zero block, so the whole term must be zero. All the terms are zero, so the sum is zero, and thus this is the same as

$$\det(M) = \sum_{\sigma\in X}\left[(-1)^{\sigma} \prod_{1\leqq i\leqq m} M_{i,\sigma(i)} \prod_{m<j\leqq n} M_{j,\sigma(j)}\right] \qquad (19.124)$$

Let us consider some permutation $\sigma\in X$. Note that we can write $\sigma=\sigma_{a}\circ\sigma_{b}$, where for some $a\in S_{m}$ we have $\sigma_{a}=(a(1),a(2),\dots,a(m),m+1,\dots,n)$, and for some $b\in S_{n-m}$ we have $\sigma_{b}=(1,2,\dots,m,m+b(1),\dots,m+b(n-m))$; moreover $(-1)^{\sigma}=(-1)^{a}(-1)^{b}$. Therefore we can write the sum as

$$\det(M) = \sum_{a\in S_{m}}\sum_{b\in S_{n-m}}\left[(-1)^{a}(-1)^{b} \prod_{1\leqq i\leqq m} M_{i,a(i)} \prod_{1\leqq j\leqq n-m} M_{m+j,\,m+b(j)}\right] \qquad (19.125)$$

$$= \sum_{a\in S_{m}}\left[(-1)^{a} \prod_{1\leqq i\leqq m} M_{i,a(i)} \sum_{b\in S_{n-m}}\left[(-1)^{b} \prod_{1\leqq j\leqq n-m} M_{m+j,\,m+b(j)}\right]\right] \qquad (19.127)$$

and since $M_{i,a(i)}=A_{i,a(i)}$ for $1\leqq i\leqq m$ and $M_{m+j,\,m+b(j)}=C_{j,b(j)}$ for $1\leqq j\leqq n-m$,

$$= \left(\sum_{a\in S_{m}}\left[(-1)^{a} \prod_{1\leqq i\leqq m} A_{i,a(i)}\right]\right)\left(\sum_{b\in S_{n-m}}\left[(-1)^{b} \prod_{1\leqq j\leqq n-m} C_{j,b(j)}\right]\right) \qquad (19.129)$$

$$= \det(A)\times\det(C) \qquad (19.130)$$
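A numeric sanity check of the theorem, as a sketch (the permutation-based `det` and the example blocks are assumptions of this snippet, not from the text):

```python
# Check det(M) = det(A) det(C) for a block matrix M = [[A, B], [0, C]].
from itertools import permutations

def sign(s):
    return (-1) ** sum(1 for i in range(len(s))
                         for j in range(i + 1, len(s)) if s[i] > s[j])

def det(A):
    """Permutation (Leibniz) definition of the determinant."""
    n = len(A)
    total = 0
    for s in permutations(range(n)):
        term = sign(s)
        for i in range(n):
            term *= A[i][s[i]]
        total += term
    return total

A = [[1, 2], [3, 5]]                     # m x m block (m = 2)
B = [[7, 8, 9], [1, 0, 2]]               # m x (n-m) block: does not affect det(M)
C = [[2, 0, 1], [1, 1, 0], [0, 3, 1]]    # (n-m) x (n-m) block
m, k = len(A), len(C)

# Assemble M row by row: [A | B] on top, [0 | C] below.
M = [A[i] + B[i] for i in range(m)] + [[0] * m + C[i] for i in range(k)]
assert det(M) == det(A) * det(C)
```

Changing the entries of `B` leaves `det(M)` unchanged, which matches the fact that $B$ never appears in the final product.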

## 19.4.2 Using matrices

Just as differentiation from "first principles" is not the most exciting thing (it's definitely useful to know how to do it, but in general, first principles is not how people differentiate), we have a similar situation with determinants: the permutation definition is rarely the most convenient tool.

This is illustrated by this example/theorem hybrid.

###### Theorem 19.4.4

For any $A\in\textsf{M}_{n\times n}(\mathbb{K})$ (read: any $n\times n$ matrix with entries in a field $\mathbb{K}$),

$$\det(A^{T}) = \det(A). \qquad (19.131)$$

###### Example 19.4.1

Prove Theorem 19.4.4.

We could do this by considering the permutation definition (Definition 19.4.3), but it's easier to prove this by using elementary matrices.

We know that if $A$ is invertible, then using elementary row-operations we can reduce it to the identity matrix; i.e. there exist elementary matrices $E_{1},E_{2},\dots,E_{k}$ such that

$$E_{1}E_{2}\cdots E_{k}A = I. \qquad (19.132)$$

As elementary matrices are invertible, from this we can infer that

$$A = (E_{1}E_{2}\cdots E_{k})^{-1} \qquad (19.133)$$

$$= (E_{k})^{-1}(E_{k-1})^{-1}\cdots(E_{1})^{-1}. \qquad (19.134)$$

Then, considering $A^{T}$, we get

$$A^{T} = \Big((E_{k})^{-1}(E_{k-1})^{-1}\cdots(E_{1})^{-1}\Big)^{T} \qquad (19.135)$$

$$= ((E_{1})^{-1})^{T}\cdots((E_{k-1})^{-1})^{T}((E_{k})^{-1})^{T}. \qquad (19.136)$$

These expressions are pretty similar; we know (todo: add proof) that the determinant of any elementary matrix is equal to the determinant of its transpose, so applying the multiplicative property of determinants to both expressions, we have shown that $\det(A^{T})=\det(A)$ whenever $A$ is invertible. (If $A$ is not invertible, then neither is $A^{T}$, so both determinants are zero and the identity holds trivially.)
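The theorem itself is easy to test numerically. A quick sketch (the permutation-based `det` and the example matrix are assumptions of this snippet):

```python
# Check det(A^T) = det(A) using the permutation definition of the determinant.
from itertools import permutations

def sign(s):
    return (-1) ** sum(1 for i in range(len(s))
                         for j in range(i + 1, len(s)) if s[i] > s[j])

def det(A):
    """Permutation (Leibniz) definition of the determinant."""
    n = len(A)
    total = 0
    for s in permutations(range(n)):
        term = sign(s)
        for i in range(n):
            term *= A[i][s[i]]
        total += term
    return total

A = [[1, 4, 2], [0, 3, 5], [2, 1, 1]]    # arbitrary example matrix
At = [list(row) for row in zip(*A)]      # the transpose of A
assert det(At) == det(A)
```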