
Linear algebra (Osnabrück 2024-2025)/Part I/Lecture 8/latex


\setcounter{section}{8}






\subtitle {Dimension theory}

A finitely generated vector space has many quite different bases. For example, if a system of homogeneous linear equations in $n$ variables is given, then its solution space is, due to Lemma 6.11, a linear subspace of $K^n$, and a basis of the solution space can be found by constructing an equivalent system in echelon form. However, in the process of elimination, several choices are possible, and different choices yield different bases of the solution space. It is not even clear, a priori, whether the number of basic solutions is independent of these choices. In this section, we show in general that the number of elements in a basis of a vector space is constant and depends only on the vector space. We prove this important property after some technical preparations, and we take it as the starting point for the definition of the dimension of a vector space.




\inputfactproof
{Vector space/Basis/Exchange lemma/Fact}
{Lemma}
{}
{

\factsituation {Let $K$ denote a field, let $V$ denote a $K$-vector space, and let a basis \mathl{v_1 , \ldots , v_n}{} of $V$ be given.}
\factcondition {Let
\mathrelationchain
{\relationchain
{ w }
{ \in }{ V }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} be a vector with a representation
\mathrelationchaindisplay
{\relationchain
{ w }
{ =} { \sum_{i = 1}^n s_i v_i }
{ } { }
{ } { }
{ } { }
} {}{}{,} where
\mathrelationchain
{\relationchain
{ s_k }
{ \neq }{0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} for some fixed $k$.}
\factconclusion {Then also the family
\mathdisp {v_1 , \ldots , v_{k-1} , w, v_{k+1} , \ldots , v_n} { }
is a basis of $V$.}
\factextra {}
}
{

We show first that the new family is a generating system. Because of
\mathrelationchaindisplay
{\relationchain
{ w }
{ =} { \sum_{i = 1}^n s_i v_i }
{ } { }
{ } { }
{ } { }
} {}{}{} and
\mathrelationchain
{\relationchain
{ s_k }
{ \neq }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} we can express the vector $v_k$ as
\mathrelationchaindisplay
{\relationchain
{ v_k }
{ =} { \frac{1}{ s_k} w - \sum_{i = 1}^{k-1} \frac{ s_i}{ s_k} v_i - \sum_{i = k+1}^{n} \frac{ s_i}{ s_k} v_i }
{ } { }
{ } { }
{ } { }
} {}{}{.} Let
\mathrelationchain
{\relationchain
{ u }
{ \in }{ V }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} be given. Then, we can write
\mathrelationchainalign
{\relationchainalign
{ u }
{ =} { \sum_{i = 1}^n t_i v_i }
{ =} { \sum_{i = 1}^{k-1} t_i v_i + t_k v_k + \sum_{i = k+1}^n t_i v_i }
{ =} { \sum_{i = 1}^{k-1} t_i v_i + t_k { \left(\frac{1}{ s_k}w - \sum_{i = 1}^{k-1} \frac{ s_i}{ s_k} v_i - \sum_{i = k+1}^{n} \frac{ s_i}{ s_k} v_i\right) } + \sum_{i = k+1}^n t_i v_i }
{ =} { \sum_{i = 1}^{k-1} { \left(t_i - t_k \frac{ s_i}{ s_k}\right) } v_i + \frac{ t_k }{ s_k}w + \sum_{i = k+1}^n { \left(t_i - t_k \frac{ s_i}{ s_k}\right) } v_i }
} {} {}{.}

To show the linear independence, we may assume
\mathrelationchain
{\relationchain
{ k }
{ = }{ 1 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} to simplify the notation. Let
\mathrelationchaindisplay
{\relationchain
{ t_1w + \sum_{i = 2}^n t_iv_i }
{ =} { 0 }
{ } { }
{ } { }
{ } { }
} {}{}{} be a representation of $0$. Then
\mathrelationchaindisplay
{\relationchain
{ 0 }
{ =} { t_1 w + \sum_{i = 2}^n t_iv_i }
{ =} { t_1 { \left(\sum_{i = 1}^n s_i v_i \right) } + \sum_{i = 2}^n t_iv_i }
{ =} { t_1 s_1v_1 + \sum_{i = 2}^n { \left(t_1 s_i+ t_i \right) } v_i }
{ } { }
} {}{}{.} From the linear independence of the original family, we deduce
\mathrelationchain
{\relationchain
{ t_1 s_1 }
{ = }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} Because of
\mathrelationchain
{\relationchain
{ s_1 }
{ \neq }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} we get
\mathrelationchain
{\relationchain
{ t_1 }
{ = }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} Therefore,
\mathrelationchain
{\relationchain
{ \sum_{i = 2}^n t_iv_i }
{ = }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} and hence
\mathrelationchain
{\relationchain
{ t_i }
{ = }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} for all $i$.

}


The preceding statement is called the basis exchange lemma; the following statement is called the basis exchange theorem.




\inputfactproof
{Vector space/Basis/Exchange theorem/Fact}
{Theorem}
{}
{

\factsituation {Let $K$ denote a field, let $V$ denote a $K$-vector space, and let a basis \mathl{b_1 , \ldots , b_n}{} of $V$ be given.}
\factcondition {Let
\mathdisp {u_1 , \ldots , u_k} { }
denote a family of linearly independent vectors in $V$.}
\factconclusion {Then there exists a subset
\mathrelationchaindisplay
{\relationchain
{ J }
{ =} { \{ i_1, i_2 , \ldots , i_k \} }
{ \subseteq} { \{1 , \ldots , n \} }
{ =} { I }
{ } { }
} {}{}{} such that the family
\mathdisp {u_1 , \ldots , u_k, b_i, i \in I \setminus J} { , }
is a basis of $V$.}
\factextra {In particular,
\mathrelationchain
{\relationchain
{ k }
{ \leq }{ n }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.}}
}
{

We prove the statement by induction over $k$, the number of vectors in the family. For
\mathrelationchain
{\relationchain
{ k }
{ = }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} there is nothing to show. Suppose now that the statement is already proven for $k$, and let \mathl{k+1}{} linearly independent vectors
\mathdisp {u_1 , \ldots , u_k, u_{k+1}} { }
be given. By the induction hypothesis, applied to the vectors \extrabracket {which are also linearly independent} {} {}
\mathdisp {u_1 , \ldots , u_k} { , }
there exists a subset
\mathrelationchain
{\relationchain
{ J }
{ = }{ \{ i_1, i_2 , \ldots , i_k \} }
{ \subseteq }{ \{1 , \ldots , n \} }
{ }{ }
{ }{ }
} {}{}{} such that the family
\mathdisp {u_1 , \ldots , u_k, b_i, i \in I \setminus J} { , }
is a basis of $V$. We want to apply the basis exchange lemma to this basis. As it is a basis, we can write
\mathrelationchaindisplay
{\relationchain
{ u_{k+1} }
{ =} { \sum_{j = 1}^k c_j u_j + \sum_{ i \in I \setminus J} d_i b_i }
{ } { }
{ } { }
{ } { }
} {}{}{.} Suppose that all coefficients
\mathrelationchain
{\relationchain
{ d_i }
{ = }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} Then \mathl{u_{k+1}}{} would be a linear combination of \mathl{u_1 , \ldots , u_k}{,} contradicting the linear independence of
\mathcond {u_j} {}
{j=1 , \ldots , k+1} {}
{} {} {} {.} Hence, there exists some
\mathrelationchain
{\relationchain
{ i }
{ \in }{ I \setminus J }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} with
\mathrelationchain
{\relationchain
{ d_i }
{ \neq }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} We put
\mathrelationchain
{\relationchain
{ i_{k+1} }
{ \defeq }{ i }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} Then
\mathrelationchain
{\relationchain
{ J' }
{ = }{ \{ i_1, i_2 , \ldots , i_k, i_{k+1} \} }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} is a subset of \mathl{\{ 1 , \ldots , n \}}{} with \mathl{k+1}{} elements. By the basis exchange lemma, we can replace the basis vector \mathcor {} {b_{i_{k+1} }} {by} {u_{k+1}} {,} and we obtain the new basis
\mathdisp {u_1 , \ldots , u_k, u_{k+1}, b_i, i \in I \setminus J'} { . }
The final statement follows, since the subset $J$ has $k$ elements and is contained in a set with $n$ elements.

}





\inputexample{}
{

We consider the standard basis \mathl{e_1,e_2,e_3}{} of $K^3$ and the two linearly independent vectors \mathcor {} {u_1 = \begin{pmatrix} 3 \\2\\ 1 \end{pmatrix}} {and} {u_2 = \begin{pmatrix} 5 \\4\\ 2 \end{pmatrix}} {.} We want to extend this family to a basis, drawing from the standard basis and following the inductive method described in the proof of the basis exchange theorem. We first consider
\mathrelationchaindisplay
{\relationchain
{u_1 }
{ =} { 3 e_1 +2e_2 + e_3 }
{ } { }
{ } { }
{ } { }
} {}{}{.} Since no coefficient is $0$, the basis exchange lemma allows us to replace any one standard vector by $u_1$; that is, $u_1$ together with any two of the standard vectors forms a basis. We work with the new basis
\mathdisp {u_1,e_1,e_2} { . }
In a second step, we would like to include $u_2$. We have
\mathrelationchaindisplay
{\relationchain
{ u_2 }
{ =} { \begin{pmatrix} 5 \\4\\ 2 \end{pmatrix} }
{ =} { 2 \begin{pmatrix} 3 \\2\\ 1 \end{pmatrix} -e_1 }
{ =} { 2u_1 -e_1 +0e_2 }
{ } { }
} {}{}{.} According to the proof, we have to get rid of $e_1$, as its coefficient in this equation is $\neq 0$ \extrabracket {we cannot get rid of $e_2$, since its coefficient is $0$} {} {.} The new basis is, therefore,
\mathdisp {u_1,u_2,e_2} { . }

}
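The exchange steps in this example can be checked numerically. The following sketch is an illustration added here, not part of the lecture: it works over the reals with numpy, and tests whether $n$ vectors in $K^n$ form a basis by computing the rank of the matrix having these vectors as columns.

```python
import numpy as np

# Vectors from the worked example above, over the reals.
u1 = np.array([3., 2., 1.])
u2 = np.array([5., 4., 2.])
e1, e2, e3 = np.eye(3)   # rows of the identity: the standard basis

def is_basis(*vectors):
    """n vectors in K^n form a basis iff the square matrix with
    these vectors as columns has full rank."""
    M = np.column_stack(vectors)
    return M.shape[0] == M.shape[1] and np.linalg.matrix_rank(M) == M.shape[0]

print(is_basis(u1, e1, e2))   # first exchange step: True
print(is_basis(u1, u2, e2))   # final basis from the example: True
print(is_basis(u1, u2, e1))   # keeping e1 (removing e2) instead: False
```

The last line confirms the remark in the example: since $u_2 = 2u_1 - e_1$, the family \mathl{u_1,u_2,e_1}{} is linearly dependent, so only $e_1$ can be exchanged for $u_2$.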




\inputfactproof
{Vector space/Finitely generated/Length of every basis/Fact}
{Theorem}
{}
{

\factsituation {Let $K$ be a field, and let $V$ be a $K$-vector space with a finite generating system.}
\factconclusion {Then any two bases of $V$ have the same number of vectors.}
\factextra {}
}
{

Let
\mathrelationchain
{\relationchain
{ \mathfrak{ b } }
{ = }{ b_1 , \ldots , b_n }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} and
\mathrelationchain
{\relationchain
{ \mathfrak{ u } }
{ = }{ u_1 , \ldots , u_k }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} denote two bases of $V$. According to the basis exchange theorem, applied to the basis $\mathfrak{ b }$ and the linearly independent family $\mathfrak{ u }$, we obtain
\mathrelationchain
{\relationchain
{ k }
{ \leq }{ n }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} When we apply the theorem with roles reversed, we get
\mathrelationchain
{\relationchain
{ n }
{ \leq }{ k }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} thus
\mathrelationchain
{\relationchain
{ n }
{ = }{ k }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.}

}


This theorem enables the following definition.


\inputdefinition
{ }
{

Let $K$ be a field, and let $V$ be a $K$-vector space with a finite generating system. Then the number of vectors in any basis of $V$ is called the \definitionword {dimension}{} of $V$, written


\mathdisp {\dim_{ K } { \left( V \right) }} { . }

}

If a vector space is not finitely generated, then one puts
\mathrelationchain
{\relationchain
{ \dim_{ K } { \left( V \right) } }
{ = }{ \infty }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} The null space $0$ has dimension $0$. A one-dimensional vector space is called a \keyword {line} {,} a two-dimensional vector space a \keyword {plane} {,} and a three-dimensional vector space a \keyword {space} {} \extrabracket {in the strict sense} {} {;} but, more generally, every vector space is also called a space.




\inputfactproof
{Standard space/K^n/Dimension n/Fact}
{Corollary}
{}
{

\factsituation {Let $K$ be a field, and
\mathrelationchain
{\relationchain
{n }
{ \in }{\N }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.}}
\factconclusion {Then the standard space $K^n$ has the dimension $n$.}
\factextra {}
}
{

The standard basis
\mathcond {e_i} {}
{i = 1 , \ldots , n} {}
{} {} {} {,} consists of $n$ vectors; hence, the dimension is $n$.

}





\inputexample{}
{

The complex numbers form a two-dimensional real vector space; a basis is \mathcor {} {1} {and} {{ \mathrm i}} {.}

}




\inputexample{}
{

The polynomial ring
\mathrelationchain
{\relationchain
{R }
{ = }{K[X] }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} over a field $K$ is not a finite-dimensional vector space. To see this, we have to show that there is no finite generating system for the polynomial ring. Consider $n$ polynomials \mathl{P_1 , \ldots , P_n}{.} Let $d$ be the maximum of the degrees of these polynomials. Then every $K$-linear combination \mathl{\sum_{i=1}^n a_i P_i}{} has degree at most $d$. In particular, polynomials of larger degree cannot be represented by \mathl{P_1 , \ldots , P_n}{,} so these do not form a generating system for all polynomials.

}
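The degree argument can be made concrete with a small sketch (an added illustration with hypothetical helper names; polynomials are encoded as coefficient lists): every linear combination of finitely many polynomials has degree at most the maximum $d$ of their degrees, so the power $X^{d+1}$ is never reached.

```python
# Polynomials are encoded as coefficient lists: [a0, a1, a2] = a0 + a1*X + a2*X^2.
def degree(p):
    """Degree of a polynomial; -1 for the zero polynomial."""
    d = -1
    for i, c in enumerate(p):
        if c != 0:
            d = i
    return d

def lin_comb(scalars, polys):
    """The K-linear combination sum_i scalars[i] * polys[i]."""
    out = [0] * max(len(p) for p in polys)
    for a, p in zip(scalars, polys):
        for i, c in enumerate(p):
            out[i] += a * c
    return out

P1 = [1, 2]        # 1 + 2X
P2 = [0, 0, 3]     # 3X^2
d = max(degree(P1), degree(P2))    # here d = 2
q = lin_comb([5, -1], [P1, P2])    # 5*P1 - P2
assert degree(q) <= d              # a combination never exceeds degree d
```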

The preceding statement follows also from the fact that, as shown in Example 7.10, the powers $X^n$ form an infinite basis of the polynomial ring. Hence, it cannot have a finite basis; see Exercise 8.17 \extrabracket {the proof of Theorem 8.4 only shows that two finite bases have the same length} {} {.}




\inputfactproof
{Vector space/Linear subspace/Dimensions/Compare/Fact}
{Corollary}
{}
{

\factsituation {Let $V$ denote a finite-dimensional vector space over a field $K$. Let
\mathrelationchain
{\relationchain
{U }
{ \subseteq }{V }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} denote a linear subspace.}
\factconclusion {Then $U$ is also finite-dimensional, and the estimate
\mathrelationchaindisplay
{\relationchain
{ \dim_{ K } { \left( U \right) } }
{ \leq} { \dim_{ K } { \left( V \right) } }
{ } { }
{ } { }
{ } { }
} {}{}{} holds.}
\factextra {}
}
{

Set
\mathrelationchain
{\relationchain
{n }
{ = }{ \dim_{ K } { \left( V \right) } }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} Every linearly independent family in $U$ is also linearly independent in $V$. Therefore, due to the basis exchange theorem, every linearly independent family in $U$ has length $\leq n$. Suppose that
\mathrelationchain
{\relationchain
{k }
{ \leq }{n }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} has the property that there exists a linearly independent family with $k$ vectors in $U$ but no such family with \mathl{k+1}{} vectors. Let
\mathrelationchain
{\relationchain
{ \mathfrak{ u } }
{ = }{ u_1 , \ldots , u_k }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} be such a family. This is then a maximal linearly independent family in $U$. Therefore, due to Theorem 7.11, it is a basis of $U$; hence, $U$ is finite-dimensional, and its dimension is \mathl{k \leq n}{.}

}


The difference
\mathdisp {\dim_{ K } { \left( V \right) } - \dim_{ K } { \left( U \right) }} { }
is also called the \keyword {codimension} {} of $U$ in $V$.
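Returning to the opening example of this lecture, the dimension of the solution space of a homogeneous system can be computed numerically. The following sketch is an added illustration over the reals; it uses the fact, obtainable by elimination, that the solution space of \mathl{Ax = 0}{} in $n$ variables has dimension \mathl{n - \operatorname{rank}(A)}{,} since each pivot removes one free variable.

```python
import numpy as np

# A homogeneous linear system Ax = 0 in n = 4 variables; its solution
# space U is a linear subspace of R^4.
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 0.]])
n = A.shape[1]

# Elimination yields dim(U) = n - rank(A).
dim_U = n - np.linalg.matrix_rank(A)
assert dim_U <= n     # the estimate dim(U) <= dim(K^n) from the corollary
print(dim_U)
```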






\image{ \begin{center}
\includegraphics[width=5.5cm]{\imageinclude {IntersecciónEspacioVectorial.gif} }
\end{center}
\imagetext {} }

\imagelicense { IntersecciónEspacioVectorial.gif } {} {Marianov} {Commons} {public domain} {}




\inputexample{}
{

Let $K$ be a field. It is easy to get an overview of the linear subspaces of $K^n$, as the dimension of a linear subspace equals
\mathcond {k} {with}
{0 \leq k \leq n} {}
{} {} {} {,} due to Corollary 8.9 . For
\mathrelationchain
{\relationchain
{n }
{ = }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} there is only the null space itself; for
\mathrelationchain
{\relationchain
{n }
{ = }{1 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} there is the null space and $K$ itself. For
\mathrelationchain
{\relationchain
{n }
{ = }{2 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} there is the null space, the whole plane $K^2$, and the one-dimensional lines through the origin. Every line $G$ has the form
\mathrelationchaindisplay
{\relationchain
{G }
{ =} { Kv }
{ =} { { \left\{ s v \mid s \in K \right\} } }
{ } { }
{ } { }
} {}{}{,} with a vector
\mathrelationchain
{\relationchain
{ v }
{ \neq }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} Two vectors different from $0$ define the same line if and only if they are linearly dependent. For
\mathrelationchain
{\relationchain
{n }
{ = }{3 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} there is the null space, the whole space $K^3$, the one-dimensional lines through the origin, and the two-dimensional planes through the origin.

}
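The description of lines in $K^2$ can also be tested numerically: two nonzero vectors define the same line exactly when they are linearly dependent, that is, when the $2 \times 2$ matrix with these vectors as columns has determinant $0$. A small sketch (an added illustration with an assumed helper name, over the reals):

```python
import numpy as np

def same_line(v, w):
    """Two nonzero vectors in K^2 span the same line iff they are
    linearly dependent, i.e. the 2x2 column matrix is singular."""
    return bool(np.isclose(np.linalg.det(np.column_stack([v, w])), 0.0))

v = np.array([1., 2.])
print(same_line(v, 3 * v))               # scalar multiple: same line
print(same_line(v, np.array([1., 0.])))  # independent vectors: different lines
```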




\inputfactproof
{Vector space/Dimension n and n vectors/Equivalences/Fact}
{Corollary}
{}
{

\factsituation {Let $K$ be a field, and let $V$ be a $K$-vector space with finite dimension
\mathrelationchain
{\relationchain
{n }
{ = }{ \dim_{ K } { \left( V \right) } }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.}}
\factcondition {Let $n$ vectors \mathl{v_1 , \ldots , v_n}{} in $V$ be given.}
\factconclusion {Then the following properties are equivalent. \enumerationthree {\mathl{v_1 , \ldots , v_n}{} form a basis of $V$. } {\mathl{v_1 , \ldots , v_n}{} form a generating system of $V$. } {\mathl{v_1 , \ldots , v_n}{} are linearly independent. }}
\factextra {}

}
{See Exercise 8.6 .}





\inputfactproof
{Vector space/Basis complement/Fact}
{Theorem}
{}
{

\factsituation {Let $V$ denote a finite-dimensional vector space over a field $K$.}
\factcondition {Let
\mathdisp {u_1 , \ldots , u_k} { }
denote linearly independent vectors in $V$.}
\factconclusion {Then there exist vectors
\mathdisp {u_{k+1} , \ldots , u_n} { }
such that
\mathdisp {u_1 , \ldots , u_k, u_{k+1} , \ldots , u_n} { }
form a basis of $V$.}
\factextra {}
}
{

Let \mathl{b_1 , \ldots , b_n}{} be a basis of $V$. Due to the basis exchange theorem, there are \mathl{n-k}{} vectors from this basis that, together with the given vectors \mathl{u_1 , \ldots , u_k}{,} form a basis of $V$.

}


In particular, every basis of a linear subspace
\mathrelationchain
{\relationchain
{ U }
{ \subseteq }{ V }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} can be extended to a basis of $V$.
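The proof of the basis-completion theorem is constructive and can be sketched as a greedy procedure (an added illustration in Python with numpy, not part of the lecture): run through a basis of $K^n$ and append each vector that strictly increases the rank of the accumulated family.

```python
import numpy as np

def extend_to_basis(independent, basis):
    """Extend a linearly independent family in R^n to a basis of R^n,
    drawing the missing vectors from the given basis."""
    current = list(independent)
    for b in basis:
        # Append b only if it is not in the span of the current family,
        # i.e. if the rank goes up by one.
        if np.linalg.matrix_rank(np.column_stack(current + [b])) == len(current) + 1:
            current.append(b)
    return current

u1 = np.array([3., 2., 1.])
u2 = np.array([5., 4., 2.])
standard = list(np.eye(3))          # e1, e2, e3
B = extend_to_basis([u1, u2], standard)
assert len(B) == 3
assert np.linalg.matrix_rank(np.column_stack(B)) == 3
```

With the vectors from the example of this lecture, the procedure skips $e_1$ (it depends on \mathl{u_1, u_2}{}) and appends $e_2$, recovering the basis \mathl{u_1, u_2, e_2}{.}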