# A calculus of the absurd

##### 20.6.2 Using matrices

Differentiation from “first principles” is useful to know, but in general it is not how people actually differentiate; determinants are similar: computing them straight from the definition is rarely the most practical method.

This is illustrated by the following example/theorem hybrid.

• Theorem 20.6.3

For any $$A \in \textsf {M}_{n \times n}(\mathbb {K})$$ (read: any $$n \times n$$ matrix with entries in a field $$\mathbb {K}$$),

$$\det (A) = \det (A^T)$$

• Example 20.6.1 Prove Theorem 20.6.3.

We could do this directly from Definition 20.126, but it’s easier to prove this by using elementary matrices.

We know that if $$A$$ is invertible, then using elementary row operations we can reduce it to the identity matrix; equivalently, $$A^{-1}$$ is a product of elementary matrices, i.e. there exist elementary matrices $$E_1, E_2, ..., E_k$$ such that

$$A (E_1 E_2 ... E_k) = I$$
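As a concrete sanity check (a minimal Python sketch; the $$2 \times 2$$ matrix $$A$$ and the specific elementary matrices are illustrative choices, not part of the argument above):

```python
# Sketch: an invertible 2x2 matrix A together with elementary matrices
# E1, E2 whose product takes A to the identity, as in A (E1 E2 ... Ek) = I.
# A, E1, E2 are illustrative choices.

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A  = [[2, 1], [1, 1]]    # invertible: det(A) = 1
E1 = [[1, 0], [-1, 1]]   # elementary shear matrix
E2 = [[1, -1], [0, 1]]   # elementary shear matrix

I = matmul(matmul(A, E1), E2)
print(I)   # [[1, 0], [0, 1]]
```

Here $$A^{-1} = E_1 E_2$$, so right-multiplying $$A$$ by the product of the two elementary matrices yields the identity.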

As elementary matrices are invertible (and their inverses are again elementary matrices), from this we can infer that

\begin{align} A &= (E_1 E_2 ... E_k)^{-1} \\ &= (E_k)^{-1} (E_{k-1})^{-1} ... (E_1)^{-1}. \end{align}
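The reversal of the factors when inverting a product can be checked numerically (a small sketch with two illustrative elementary matrices, whose inverses are written out by hand):

```python
# Sketch: check the reversal rule (E1 E2)^(-1) = E2^(-1) E1^(-1) on two
# illustrative 2x2 elementary shear matrices. The inverse of "add c times
# one column to another" simply adds -c times that column.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

E1,  E2  = [[1, 0], [-1, 1]], [[1, -1], [0, 1]]
E1i, E2i = [[1, 0], [1, 1]],  [[1, 1],  [0, 1]]   # inverses of E1, E2

prod = matmul(E1, E2)
# (E1 E2) * (E2^-1 E1^-1) should be the identity, confirming the reversed order.
check = matmul(prod, matmul(E2i, E1i))
print(check)   # [[1, 0], [0, 1]]
```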

Then, considering $$A^T$$, we get

\begin{align} A^{T} &= \Big ((E_k)^{-1} (E_{k-1})^{-1} ... (E_1)^{-1}\Big )^{T} \\ &= ((E_1)^{-1})^{T} ... ((E_{k-1})^{-1})^{T} ((E_{k})^{-1})^{T}. \end{align}
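The transpose of a product reverses the order of the factors in the same way, $$(MN)^T = N^T M^T$$, which can likewise be spot-checked (a sketch with two illustrative $$2 \times 2$$ matrices):

```python
# Sketch: check the transpose reversal rule (M N)^T = N^T M^T on two
# illustrative 2x2 matrices.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(M):
    return [list(row) for row in zip(*M)]

M, N = [[1, 2], [3, 4]], [[0, 1], [1, 1]]

lhs = transpose(matmul(M, N))
rhs = matmul(transpose(N), transpose(M))
print(lhs == rhs)   # True
```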

These expressions are pretty similar; we know (todo: add proof) that every elementary matrix has the same determinant as its transpose, so applying the multiplicative property of determinants to both products, we have shown that

$$\det (A) = \det (A^T).$$

(If $$A$$ is not invertible, then $$A^T$$ is not invertible either, so both determinants are zero and the equality holds trivially.)
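Finally, the theorem itself can be sanity-checked numerically (a sketch using a cofactor-expansion determinant; the $$3 \times 3$$ matrix and the three elementary matrices are illustrative choices):

```python
# Sketch: numerical check of det(A) = det(A^T), plus a check that each kind
# of elementary matrix has the same determinant as its transpose.

def det(M):
    """Determinant by cofactor expansion along the first row."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j]
               * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def transpose(M):
    return [list(row) for row in zip(*M)]

A = [[2, 1, 0], [4, 3, 1], [0, 1, 2]]   # illustrative non-symmetric matrix

swap  = [[0, 1, 0], [1, 0, 0], [0, 0, 1]]   # swap two rows:           det = -1
scale = [[1, 0, 0], [0, 5, 0], [0, 0, 1]]   # scale one row:           det =  5
shear = [[1, 0, 0], [2, 1, 0], [0, 0, 1]]   # add a multiple of a row: det =  1

print(det(A), det(transpose(A)))   # 2 2
for E in (swap, scale, shear):
    print(det(E) == det(transpose(E)))   # True (three times)
```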