19.2 Vector spaces
19.2.1 The vector space axioms
There are eight axioms in total, but I find it easier to remember them this way:
Definition 19.2.1
A vector space is a set $V$ over a field $\mathbb{F}$ (elements of which are called \say{scalars}) equipped with an operator called \say{vector addition}, which takes two elements of $V$ and returns a single element of $V$, and an operator called \say{scalar multiplication}, which takes a scalar and a vector and outputs a vector.
We have the following eight axioms, grouped into the five points below:
1. The first four axioms are equivalent to stating that $(V, +)$ must be an Abelian group (vector addition is associative and commutative, there is a zero vector $0 \in V$, and every $v \in V$ has an additive inverse $-v$).
2. We have two kinds of distributivity. One is that if $\lambda \in \mathbb{F}$ and $v, w \in V$, then
$\lambda(v + w) = \lambda v + \lambda w$.   (19.40)
3. The second is that if $\lambda, \mu \in \mathbb{F}$ and $v \in V$, then
$(\lambda + \mu)v = \lambda v + \mu v$.   (19.41)
4. The neutral element of multiplication (e.g. in $\mathbb{R}$ this is $1$) in $\mathbb{F}$ has the following property:
$1 \cdot v = v$ for all $v \in V$.   (19.42)
5. We also have a kind of \say{multiplicative distributivity}: for all $\lambda, \mu \in \mathbb{F}$ and $v \in V$,
$\lambda(\mu v) = (\lambda \mu)v$.   (19.43)
Not exactly the most exciting stuff, but we can’t build castles without foundations! I’m not a structural engineer, but I’m pretty sure this is a true statement.
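To make the axioms a little more concrete, here is a small numerical sanity check (my own illustration, not part of the original text) of axioms (19.40) to (19.43) for $V = \mathbb{R}^3$ over $\mathbb{R}$; it assumes the numpy library, and the particular vectors and scalars are an arbitrary choice.

import numpy as np

rng = np.random.default_rng(0)
v, w = rng.normal(size=3), rng.normal(size=3)   # two vectors in R^3
lam, mu = 2.5, -1.25                            # two scalars

# lambda * (v + w) == lambda * v + lambda * w   (19.40)
assert np.allclose(lam * (v + w), lam * v + lam * w)
# (lambda + mu) * v == lambda * v + mu * v      (19.41)
assert np.allclose((lam + mu) * v, lam * v + mu * v)
# 1 * v == v                                    (19.42)
assert np.allclose(1 * v, v)
# lambda * (mu * v) == (lambda * mu) * v        (19.43)
assert np.allclose(lam * (mu * v), (lam * mu) * v)

Of course, passing these checks for a few random vectors is not a proof; for $\mathbb{R}^n$ the axioms follow directly from the field axioms of $\mathbb{R}$, applied componentwise.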
19.2.2 Linear independence
Linear independence is one of the most important concepts in linear algebra; informally, a collection of vectors is linearly independent if none of them can be written as a linear combination of the others.
Definition 19.2.2
Let $v_1, v_2, \dots, v_n$ be some vectors in a vector space $V$, and let $a_1, a_2, \dots, a_n$ be some scalars in the field $\mathbb{F}$ (over which this vector space is defined).
We say these vectors are linearly independent if and only if
$a_1 v_1 + a_2 v_2 + \dots + a_n v_n = 0 \implies a_1 = a_2 = \dots = a_n = 0$.   (19.44)
In words, this means \say{if the only values for the $a_i$s which satisfy $a_1 v_1 + \dots + a_n v_n = 0$ are when all the $a_i$s are zero, then the vectors are linearly independent}.
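In practice, a standard way to test whether a finite collection of vectors in $\mathbb{R}^m$ is linearly independent is to stack them as the columns of a matrix and check whether its rank equals the number of vectors. A minimal sketch (my own, assuming numpy; the particular vectors are just an example):

import numpy as np

# Columns of M are the vectors v_1, v_2, v_3 in R^3.
M = np.column_stack([
    [1.0, 0.0, 0.0],
    [1.0, 1.0, 0.0],
    [1.0, 1.0, 1.0],
])

# The vectors are linearly independent iff the only solution of
# M @ a = 0 is a = 0, i.e. iff rank(M) equals the number of columns.
independent = np.linalg.matrix_rank(M) == M.shape[1]
print(independent)   # True for this choice of vectors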
19.2.3 Bases
Change of basis
Let us suppose that we have two sets of basis vectors for the same vector space $V$. It doesn't really matter what we call them, but $A$ and $B$ are names as good as any. These vectors can be written in the form
$A = \{a_1, a_2, \dots, a_n\}$   (19.45)
$B = \{b_1, b_2, \dots, b_n\}$.   (19.46)
For any vector $v \in V$ we can always write it in the $A$ co-ordinate system by writing the vector as a linear combination of the vectors in $A$ (this is always possible because $A$ is a basis for $V$). We can write this as
$v = (\alpha_1, \alpha_2, \dots, \alpha_n)_A$,   (19.47)
where $\alpha_1, \dots, \alpha_n$ are scalars such that
$v = \alpha_1 a_1 + \alpha_2 a_2 + \dots + \alpha_n a_n$.   (19.48)
That is, they are the coefficients needed to write $v$ as a linear combination of the vectors in $A$. This also helps to understand why, for example, the vector space of symmetric $2 \times 2$ matrices (i.e. those of the form
$\begin{pmatrix} a & b \\ b & c \end{pmatrix}$   (19.49)
) is three dimensional; we can write every such matrix as a vector of dimension $3$, where each coefficient denotes what to multiply each basis vector by to obtain our specific matrix.
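To spell this out with a worked example (my own, not from the original text): one convenient basis for the symmetric $2 \times 2$ matrices is
$E_1 = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \quad E_2 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \quad E_3 = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix},$
and any symmetric matrix decomposes as
$\begin{pmatrix} a & b \\ b & c \end{pmatrix} = a E_1 + b E_2 + c E_3,$
so its co-ordinate vector with respect to this basis is $(a, b, c)$.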
But what if we want to find a way to translate the $A$ co-ordinates $(\alpha_1, \dots, \alpha_n)_A$ into the $B$ co-ordinates $(\beta_1, \dots, \beta_n)_B$? This is actually doable using a single matrix. Here's how. We start by applying the definition of $(\alpha_1, \dots, \alpha_n)_A$, that is, we have that $v = (\alpha_1, \dots, \alpha_n)_A$ if and only if
$v = \alpha_1 a_1 + \alpha_2 a_2 + \dots + \alpha_n a_n$.   (19.50)
To find $(\beta_1, \dots, \beta_n)_B$, it is sufficient to find the $a_i$ in terms of the basis vectors in $B$. How do we do this? A straightforward approach is to write every vector in $A$ in terms of those in $B$ and then to substitute for them, which removes all the $a$-vectors and means that we instead have $b$-vectors.
Because $A$ and $B$ are both bases for $V$, we can write every vector in $A$ in terms of those in $B$:
$a_1 = \gamma_{1,1} b_1 + \gamma_{2,1} b_2 + \dots + \gamma_{n,1} b_n$   (19.51)
$a_2 = \gamma_{1,2} b_1 + \gamma_{2,2} b_2 + \dots + \gamma_{n,2} b_n$   (19.52)
$\vdots$   (19.53)
$a_n = \gamma_{1,n} b_1 + \gamma_{2,n} b_2 + \dots + \gamma_{n,n} b_n$   (19.54)
We can then substitute this into the linear combination of $v$ in terms of the basis vectors in $A$, giving
$v = \alpha_1 a_1 + \alpha_2 a_2 + \dots + \alpha_n a_n$   (19.56)
$\;\; = \alpha_1 (\gamma_{1,1} b_1 + \gamma_{2,1} b_2 + \dots + \gamma_{n,1} b_n)$   (19.57)
$\;\; \quad + \alpha_2 (\gamma_{1,2} b_1 + \gamma_{2,2} b_2 + \dots + \gamma_{n,2} b_n)$   (19.58)
$\;\; \quad + \dots + \alpha_n (\gamma_{1,n} b_1 + \gamma_{2,n} b_2 + \dots + \gamma_{n,n} b_n)$   (19.59)
This looks scary, but we just need to stick to the definitions and keep our goal in mind: writing $v$ in terms of all the $b_i$. We can move things around, collecting the coefficient of each $b_i$, to obtain
$v = (\alpha_1 \gamma_{1,1} + \alpha_2 \gamma_{1,2} + \dots + \alpha_n \gamma_{1,n}) b_1$   (19.60)
$\;\; \quad + (\alpha_1 \gamma_{2,1} + \alpha_2 \gamma_{2,2} + \dots + \alpha_n \gamma_{2,n}) b_2$   (19.61)
$\;\; \quad + \dots$   (19.62)
$\;\; \quad + (\alpha_1 \gamma_{n,1} + \alpha_2 \gamma_{n,2} + \dots + \alpha_n \gamma_{n,n}) b_n$   (19.63)
Therefore, we have that
$\begin{pmatrix} \beta_1 \\ \beta_2 \\ \vdots \\ \beta_n \end{pmatrix} = \begin{pmatrix} \gamma_{1,1} & \gamma_{1,2} & \cdots & \gamma_{1,n} \\ \gamma_{2,1} & \gamma_{2,2} & \cdots & \gamma_{2,n} \\ \vdots & \vdots & \ddots & \vdots \\ \gamma_{n,1} & \gamma_{n,2} & \cdots & \gamma_{n,n} \end{pmatrix} \begin{pmatrix} \alpha_1 \\ \alpha_2 \\ \vdots \\ \alpha_n \end{pmatrix}$   (19.68)
that is, $\beta_i = \alpha_1 \gamma_{i,1} + \alpha_2 \gamma_{i,2} + \dots + \alpha_n \gamma_{i,n}$ for each $i$, and the matrix of $\gamma$s is the change of basis matrix from $A$ co-ordinates to $B$ co-ordinates.
Alternatively: the change of basis matrix has as its $j$th column the scalars needed to write the $j$th element of the old basis $A$ as a linear combination of the vectors of the new basis $B$.
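As a concrete numerical sketch (my own illustration, assuming numpy and that both bases are given by their columns of co-ordinates in the standard basis of $\mathbb{R}^2$; the particular numbers are arbitrary), the change of basis matrix can be computed and checked as follows.

import numpy as np

# Two bases of R^2, written as columns in standard co-ordinates:
# A = {a_1, a_2}, B = {b_1, b_2}.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# Change of basis matrix from A co-ordinates to B co-ordinates:
# its j-th column holds the B co-ordinates of a_j (solve B X = A).
M = np.linalg.solve(B, A)

# Take a vector with known A co-ordinates alpha and check that
# M @ alpha really gives its B co-ordinates.
alpha = np.array([4.0, -1.0])
v = A @ alpha            # the vector itself, in standard co-ordinates
beta = M @ alpha         # claimed B co-ordinates
assert np.allclose(B @ beta, v)
print(beta)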
19.2.4 Subspaces
Definition 19.2.3
Let $V$ be a vector space. The set $W$ is a subspace of $V$ if $W$ is itself a vector space (under the same addition and scalar multiplication as $V$), and $W$ is a subset of $V$.
Technique 19.2.1
Showing that something is a subspace. Suppose we have a vector space $V$, and we want to prove that $W$ is a subspace of $V$. The steps to do so are these (a short worked example follows the list):
1. Show that the zero vector is in the subspace in question, i.e. that $0 \in W$.
2. Show that $W \subseteq V$, using the standard technique for showing that something is a subset of something else (as in Section TODO: write).
3. Then we must show that $W$ is closed under vector addition and scalar multiplication. The rest of the vector space axioms follow from the fact that $W \subseteq V$ and $V$ is a vector space.
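As a quick worked example (my own, not from the text): let $V = \mathbb{R}^2$ and $W = \{(x, y) \in \mathbb{R}^2 : x + y = 0\}$. The zero vector satisfies $0 + 0 = 0$, so $0 \in W$; $W \subseteq \mathbb{R}^2$ by construction; and if $(x_1, y_1), (x_2, y_2) \in W$ and $\lambda \in \mathbb{R}$, then $(x_1 + x_2) + (y_1 + y_2) = 0$ and $\lambda x_1 + \lambda y_1 = \lambda(x_1 + y_1) = 0$, so $W$ is closed under vector addition and scalar multiplication. Hence $W$ is a subspace of $\mathbb{R}^2$.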
This theorem is given both as an example of how to prove facts about vector spaces and because it is important in its own right.
Theorem 19.2.1
Let $V$ be a vector space, and let $U$ and $W$ be subspaces of $V$. Then $U \cup W$ is a subspace of $V$ if and only if $U \subseteq W$ or $W \subseteq U$.
To prove this, first we will show the \say{if} direction, and then the \say{only if} direction.
• If. Without loss of generality, assume that $U \subseteq W$, in which case $U \cup W = W$, and this is a subspace of $V$ as $W$ is a subspace of $V$. The proof for the other case follows by swapping $U$ and $W$ in the proof.
• Only if. This direction requires a bit more of an intuition about which directions to explore. First we will assume that $U \cup W$ is a subspace, and then we will assume that the consequent is untrue (i.e. that neither $U \subseteq W$ nor $W \subseteq U$ is true), in which case there exist $u$ and $w$ such that
$u \in U, \; u \notin W \quad \text{and} \quad w \in W, \; w \notin U$.   (19.83)
We can then ask (this is the core idea in the proof which is not immediately obvious, to me at least) about the status of $u + w$. As $u, w \in U \cup W$ and by assumption $U \cup W$ is a subspace (and therefore by definition closed under vector addition), it must be that $u + w \in U \cup W$. Then either $u + w \in U$ or $u + w \in W$ (by definition of the set union).
1. If $u + w \in U$, then also $w = (u + w) - u \in U$ (as $U$ is a subspace and so contains $-u$ and is closed under addition), which is a contradiction as by definition of $w$ (Equation 19.83) $w \notin U$.
2. If $u + w \in W$, a very similar thing is the case; also $u = (u + w) - w \in W$, which is a contradiction as $u \notin W$.
Therefore, by contradiction this direction of the theorem must be true.