
Linear algebra (Osnabrück 2024-2025)/Part I/Lecture 8




Dimension theory

A finitely generated vector space has many quite different bases. For example, if a system of homogeneous linear equations in $n$ variables is given, then its solution space is, by an earlier result, a linear subspace of $K^n$, and a basis of the solution space can be found by passing to an equivalent system in echelon form. However, in the process of elimination, several choices are possible, and different choices yield different bases of the solution space. It is not even clear from the outset whether the number of basic solutions is independent of these choices. In this section, we show in general that the number of elements in a basis of a vector space is constant and depends only on the vector space. We prove this important property after some technical preparations, and we take it as the starting point for the definition of the dimension of a vector space.
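The following minimal Python/NumPy sketch illustrates this phenomenon on a concrete system (the matrix and both solution families are chosen for illustration): two different bases of the same solution space, each consisting of two vectors.

    import numpy as np

    # Homogeneous system A x = 0 in 4 variables with 2 independent equations.
    A = np.array([[1., 2., 1., 0.],
                  [0., 1., 1., 1.]])

    # First basis of the solution space: free variables x3, x4.
    b1 = np.array([1., -1., 1., 0.])   # x3 = 1, x4 = 0
    b2 = np.array([2., -1., 0., 1.])   # x3 = 0, x4 = 1
    # A second, different basis of the same solution space.
    c1 = np.array([-1., 1., -1., 0.])  # c1 = -b1
    c2 = np.array([1., 0., -1., 1.])   # c2 = b2 - b1

    for v in (b1, b2, c1, c2):
        assert np.allclose(A @ v, 0)   # all four vectors solve A x = 0

    # Both families are linearly independent, hence both are bases of the
    # solution space; by rank-nullity its dimension is 4 - rank(A) = 2.
    print(np.linalg.matrix_rank(np.column_stack([b1, b2])))  # 2
    print(np.linalg.matrix_rank(np.column_stack([c1, c2])))  # 2
    print(4 - np.linalg.matrix_rank(A))                      # 2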


Let $K$ denote a field, let $V$ denote a $K$-vector space, and let a basis

$$v_1, \ldots, v_n$$

of $V$ be given. Let $w \in V$ be a vector with a representation

$$w = \sum_{i = 1}^n s_i v_i,$$

where $s_k \neq 0$ for some fixed $k$. Then also the family

$$v_1, \ldots, v_{k-1}, w, v_{k+1}, \ldots, v_n$$

is a basis of $V$.

We show first that the new family is a generating system of $V$. Because of

$$w = \sum_{i = 1}^n s_i v_i$$

and $s_k \neq 0$, we can express the vector $v_k$ as

$$v_k = \frac{1}{s_k} w - \sum_{i \neq k} \frac{s_i}{s_k} v_i.$$

Let $u = \sum_{i = 1}^n t_i v_i$ be given. Then we can write

$$u = t_k v_k + \sum_{i \neq k} t_i v_i = \frac{t_k}{s_k} w + \sum_{i \neq k} \left( t_i - \frac{t_k s_i}{s_k} \right) v_i.$$

To show the linear independence, we may assume $k = 1$ to simplify the notation. Let

$$c_1 w + c_2 v_2 + \cdots + c_n v_n = 0$$

be a representation of $0$. Then

$$0 = c_1 \left( \sum_{i = 1}^n s_i v_i \right) + \sum_{i = 2}^n c_i v_i = c_1 s_1 v_1 + \sum_{i = 2}^n ( c_1 s_i + c_i ) v_i.$$

From the linear independence of the original family, we deduce $c_1 s_1 = 0$. Because of $s_1 \neq 0$, we get $c_1 = 0$. Therefore $c_1 s_i + c_i = 0$, and hence $c_i = 0$ for all $i$.
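A quick numerical check of the lemma (a Python/NumPy sketch; the basis, the coefficients, and the exchanged position are chosen for illustration):

    import numpy as np

    v1, v2, v3 = np.array([1., 0., 0.]), np.array([1., 1., 0.]), np.array([1., 1., 1.])
    s = np.array([2., 3., 0.])            # s2 = 3 != 0; s3 = 0 is allowed
    w = s[0]*v1 + s[1]*v2 + s[2]*v3

    old = np.column_stack([v1, v2, v3])
    new = np.column_stack([v1, w, v3])    # exchange v2 for w

    print(np.linalg.det(old))             # nonzero: v1, v2, v3 is a basis
    print(np.linalg.det(new))             # nonzero: v1, w, v3 is a basis
    # By multilinearity of the determinant, det(new) = s2 * det(old).
    print(np.isclose(np.linalg.det(new), s[1] * np.linalg.det(old)))  # True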


The statement just proved is called the basis exchange lemma; the following statement is called the basis exchange theorem.


Let $K$ denote a field, let $V$ denote a $K$-vector space, and let a basis

$$b_1, \ldots, b_n$$

of $V$ be given. Let

$$u_1, \ldots, u_k$$

denote a family of linearly independent vectors in $V$. Then there exists a subset

$$J = \{ i_1, \ldots, i_{n-k} \} \subseteq \{ 1, \ldots, n \}$$

such that the family

$$u_1, \ldots, u_k, b_{i_1}, \ldots, b_{i_{n-k}}$$

is a basis of $V$. In particular, $k \leq n$.

We do induction over $k$, the number of vectors in the family. For $k = 0$, there is nothing to show. Suppose now that the statement is already proven for $k$, and let $k + 1$ linearly independent vectors

$$u_1, \ldots, u_{k+1}$$

be given. By the induction hypothesis, applied to the vectors (which are also linearly independent)

$$u_1, \ldots, u_k,$$

there exists a subset $J = \{ i_1, \ldots, i_{n-k} \} \subseteq \{ 1, \ldots, n \}$ such that the family

$$u_1, \ldots, u_k, b_{i_1}, \ldots, b_{i_{n-k}}$$

is a basis of $V$. We want to apply the basis exchange lemma to this basis. As it is a basis, we can write

$$u_{k+1} = \sum_{j = 1}^k c_j u_j + \sum_{l = 1}^{n-k} d_l b_{i_l}.$$

Suppose that all coefficients $d_l = 0$. Then $u_{k+1}$ would be a linear combination of $u_1, \ldots, u_k$, contradicting the linear independence of $u_1, \ldots, u_{k+1}$. Hence, there exists some $l$ with $d_l \neq 0$. We put $J' = J \setminus \{ i_l \}$. Then $J'$ is a subset of $\{ 1, \ldots, n \}$ with $n - (k+1)$ elements. By the basis exchange lemma, we can replace the basis vector $b_{i_l}$ by $u_{k+1}$, and we obtain the new basis

$$u_1, \ldots, u_{k+1}, b_{i_1}, \ldots, \widehat{b_{i_l}}, \ldots, b_{i_{n-k}},$$

where the hat means that this vector is omitted. The final statement follows, since $J$ is a subset with $n - k$ elements inside a set with $n$ elements, so $n - k \geq 0$, that is, $k \leq n$.
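The proof is constructive and can be carried out numerically. The following Python/NumPy sketch (the helper `exchange` and all concrete vectors are illustrative choices, not part of the lecture) follows the induction step: express the next vector in the current basis, then exchange it for a remaining original basis vector whose coefficient is nonzero.

    import numpy as np

    def exchange(basis, independent):
        """Fill the linearly independent vectors up to a basis,
        following the proof of the basis exchange theorem."""
        current = list(basis)                  # start with b_1, ..., b_n
        for u in independent:
            M = np.column_stack(current)
            coeff = np.linalg.solve(M, u)      # u in the current basis
            # exchange u for an original basis vector with nonzero coefficient
            for j, c in enumerate(coeff):
                if abs(c) > 1e-12 and not any(current[j] is v for v in independent):
                    current[j] = u
                    break
            else:
                raise ValueError("the vectors are not linearly independent")
        return current

    basis = [np.array([1., 0., 0.]), np.array([0., 1., 0.]), np.array([0., 0., 1.])]
    u1, u2 = np.array([1., 1., 0.]), np.array([0., 1., 1.])
    new_basis = exchange(basis, [u1, u2])      # result: u1, u2, e3
    print(np.linalg.det(np.column_stack(new_basis)))  # nonzero: a basis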



We consider the standard basis $e_1, e_2, e_3$ of $\mathbb{R}^3$ and the two linearly independent vectors $u_1 = (1, 2, 3)$ and $u_2 = (4, 5, 6)$ (chosen for illustration). We want to extend this family to a basis, using the standard basis and according to the inductive method described in the proof of the basis exchange theorem. We first consider

$$u_1 = e_1 + 2 e_2 + 3 e_3.$$

Since no coefficient is $0$, we can extend $u_1$ with any two standard vectors to obtain a basis. We work with the new basis

$$u_1, e_2, e_3.$$

In a second step, we would like to include $u_2$. We have

$$u_2 = 4 u_1 - 3 e_2 - 6 e_3.$$

According to the proof, we have to get rid of $e_2$ or $e_3$, as their coefficients are not $0$ in this equation (we can not get rid of $u_1$). The new basis is therefore

$$u_1, u_2, e_3.$$
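The two exchange steps can be confirmed with determinants (a Python/NumPy sketch using the same illustrative vectors):

    import numpy as np

    e1, e2, e3 = np.eye(3)
    u1, u2 = np.array([1., 2., 3.]), np.array([4., 5., 6.])

    step1 = np.column_stack([u1, e2, e3])  # after replacing e1 by u1
    step2 = np.column_stack([u1, u2, e3])  # after replacing e2 by u2
    print(np.linalg.det(step1))            # 1.0, nonzero
    print(np.linalg.det(step2))            # -3.0, nonzero

    # The representation used in the second exchange step:
    print(np.allclose(u2, 4*u1 - 3*e2 - 6*e3))  # True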


Let $K$ be a field, and let $V$ be a $K$-vector space with a finite generating system. Then any two bases of $V$ have the same number of vectors.

Let $b = b_1, \ldots, b_n$ and $u = u_1, \ldots, u_k$ denote two bases of $V$. According to the basis exchange theorem, applied to the basis $b$ and the linearly independent family $u$, we obtain $k \leq n$. When we apply the theorem with the roles reversed, we get $n \leq k$, thus $n = k$.


This theorem enables the following definition.


Let $K$ be a field, and let $V$ be a $K$-vector space with a finite generating system. Then the number of vectors in any basis of $V$ is called the dimension of $V$, written

$$\dim(V).$$

If a vector space is not finitely generated, then one puts $\dim(V) = \infty$. The null space $0$ has dimension $0$. A one-dimensional vector space is called a line, a two-dimensional vector space a plane, and a three-dimensional vector space a space (in the strict sense), but every vector space is also called a space.


Let $K$ be a field and $n \in \mathbb{N}$. Then the standard space $K^n$ has dimension $n$.

The standard basis $e_i$, $1 \leq i \leq n$, consists of $n$ vectors; hence the dimension is $n$.



The complex numbers $\mathbb{C}$ form a two-dimensional real vector space; a basis is given by $1$ and $\mathrm{i}$.


The polynomial ring $R = K[X]$ over a field $K$ is not a finite-dimensional vector space. To see this, we have to show that there is no finite generating system for the polynomial ring. Consider polynomials $P_1, \ldots, P_m \in K[X]$. Let $d$ be the maximum of the degrees of these polynomials. Then every $K$-linear combination $\sum_{i = 1}^m c_i P_i$ has degree at most $d$. In particular, polynomials of larger degree can not be represented by $P_1, \ldots, P_m$, so these do not form a generating system for all polynomials.
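The degree bound in this argument can be seen concretely in a small Python/NumPy sketch (the polynomials $P_1, P_2$ are illustrative; coefficients are listed in ascending order):

    import numpy as np
    from numpy.polynomial import Polynomial

    P1 = Polynomial([1, 0, 2])         # 1 + 2X^2, degree 2
    P2 = Polynomial([0, 1, 0, 5])      # X + 5X^3, degree 3
    d = max(P1.degree(), P2.degree())  # d = 3

    combo = 7*P1 - 4*P2                # an arbitrary linear combination
    print(combo.degree() <= d)         # True: the degree never exceeds d
    # Hence X^4, X^5, ... are not in the span of P1 and P2.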

That the polynomial ring is not finite-dimensional also follows from the fact that, as shown in an earlier example, the powers $X^n$, $n \in \mathbb{N}$, form an infinite basis of the polynomial ring. Hence it can not have a finite basis; see the corresponding exercise (the proof of the theorem above only shows that two finite bases have the same length).


Let $V$ denote a finite-dimensional vector space over a field $K$. Let $U \subseteq V$ denote a linear subspace. Then $U$ is also finite-dimensional, and the estimate

$$\dim(U) \leq \dim(V)$$

holds.

Set $n = \dim(V)$. Every linearly independent family in $U$ is also linearly independent in $V$. Therefore, due to the basis exchange theorem, every linearly independent family in $U$ has length at most $n$. Hence there exists some $k \leq n$ with the property that there exists a linearly independent family with $k$ vectors in $U$, but no such family with $k + 1$ vectors. Let $u_1, \ldots, u_k$ be such a family. This is then a maximal linearly independent family in $U$. Therefore, by an earlier result, it is a basis of $U$, and so $\dim(U) = k \leq n$.
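Numerically, over the reals, the dimension of the subspace spanned by given vectors is the rank of the matrix having these vectors as columns, and the estimate is visible directly (a Python/NumPy sketch with illustrative vectors):

    import numpy as np

    w1 = np.array([1., 0., 1., 0.])
    w2 = np.array([0., 1., 0., 1.])
    w3 = np.array([1., 1., 1., 1.])   # w3 = w1 + w2 adds nothing new

    W = np.column_stack([w1, w2, w3])
    dim_U = np.linalg.matrix_rank(W)  # dimension of U = span(w1, w2, w3)
    print(dim_U)                      # 2
    print(dim_U <= 4)                 # True: dim(U) <= dim(R^4) = 4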


The difference

$$\dim(V) - \dim(U)$$

is also called the codimension of $U$ in $V$.


Let $K$ be a field, and let $V$ be a $K$-vector space with finite dimension $n = \dim(V)$. Let $n$ vectors $v_1, \ldots, v_n$ in $V$ be given. Then the following properties are equivalent.

  1. $v_1, \ldots, v_n$ form a basis of $V$.
  2. $v_1, \ldots, v_n$ form a generating system of $V$.
  3. $v_1, \ldots, v_n$ are linearly independent.

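Over the reals, this equivalence can be tested numerically: arrange the $n$ vectors as the columns of an $n \times n$ matrix; all three properties hold precisely when this matrix has full rank (a Python/NumPy sketch with illustrative vectors):

    import numpy as np

    good = np.column_stack([[1., 0., 2.], [0., 1., 1.], [1., 1., 0.]])
    bad  = np.column_stack([[1., 0., 2.], [0., 1., 1.], [1., 1., 3.]])

    print(np.linalg.matrix_rank(good))  # 3: basis, generating, independent
    print(np.linalg.matrix_rank(bad))   # 2: none of the three (third
                                        #    column = first + second)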



Let $K$ be a field. It is easy to get an overview over the linear subspaces of $K^n$, as the dimension of a linear subspace equals some $k$ with $0 \leq k \leq n$, by the result above. For $n = 0$, there is only the null space itself; for $n = 1$, there are the null space and $K$ itself. For $n = 2$, there are the null space, the whole plane $K^2$, and the one-dimensional lines through the origin. Every line $G$ has the form

$$G = K v = \{ s v \mid s \in K \}$$

with a vector $v \neq 0$. Two vectors different from $0$ define the same line if and only if they are linearly dependent. For $n = 3$, there are the null space, the whole space $K^3$, the one-dimensional lines through the origin, and the two-dimensional planes through the origin.
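Whether two nonzero vectors define the same line can be checked by a rank computation (a Python/NumPy sketch; the function name `same_line` is an illustrative choice):

    import numpy as np

    def same_line(v, w):
        # nonzero v, w span the same line iff they are linearly dependent
        return np.linalg.matrix_rank(np.column_stack([v, w])) == 1

    print(same_line(np.array([1., 2.]), np.array([3., 6.])))  # True
    print(same_line(np.array([1., 2.]), np.array([3., 5.])))  # False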


Let $V$ denote a finite-dimensional vector space over a field $K$. Let

$$u_1, \ldots, u_k$$

denote linearly independent vectors in $V$. Then there exist vectors

$$u_{k+1}, \ldots, u_n$$

such that

$$u_1, \ldots, u_n$$

form a basis of $V$.

Let $b_1, \ldots, b_n$ be a basis of $V$. Due to the basis exchange theorem, there are $n - k$ vectors from this basis which, together with the given vectors $u_1, \ldots, u_k$, form a basis of $V$.


In particular, every basis of a linear subspace $U \subseteq V$ can be extended to a basis of $V$.
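A simple way to carry out such a basis extension over the reals is to add standard vectors greedily as long as they raise the rank (a Python/NumPy sketch; the helper `extend_to_basis` is an illustrative choice and differs from the proof, which uses the basis exchange theorem):

    import numpy as np

    def extend_to_basis(vectors, n):
        family = list(vectors)
        for e in np.eye(n):               # candidates: standard basis of R^n
            if len(family) == n:
                break
            trial = np.column_stack(family + [e])
            if np.linalg.matrix_rank(trial) == len(family) + 1:
                family.append(e)          # e is independent of the family
        return family

    u1, u2 = np.array([1., 2., 3., 4.]), np.array([0., 1., 0., 1.])
    basis = extend_to_basis([u1, u2], 4)
    print(np.linalg.matrix_rank(np.column_stack(basis)))  # 4: a basis of R^4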

