Mathematics for Applied Sciences (Osnabrück 2023-2024)/Part I/Lecture 23/latex

From Wikiversity

\setcounter{section}{23}

The solution set of a homogeneous system of linear equations in $n$ variables over a field $K$ is a linear subspace of $K^n$. Quite often, this solution space is described as the set of all \quotationshort{linear combinations}{} of finitely many (simple) solutions. In this lecture, we develop the concepts to make this precise.






\subtitle {Generating systems}






\image{ \begin{center}
\includegraphics[width=5.5cm]{\imageinclude {VectorGenerado.gif} }
\end{center}
\imagetext {The plane generated by two vectors $v_1$ and $v_2$ consists of all linear combinations
\mathrelationchain
{\relationchain
{ u }
{ = }{ s v_1+ t v_2 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.}} }

\imagelicense { VectorGenerado.gif } {} {Marianov} {Commons} {PD} {}




\inputdefinition
{ }
{

Let $K$ be a field, and let $V$ be a $K$-vector space. Let \mathl{v_1 , \ldots , v_n}{} denote a family of vectors in $V$. Then the vector
\mathdisp {s_1v_1+s_2v_2 + \cdots + s_nv_n \text{ with } s_i \in K} { }
is called a \definitionword {linear combination}{} of these vectors

\extrabracket {for the \keyword {coefficient tuple} {} $(s_1 , \ldots , s_n)$} {} {.}

}

Two different coefficient tuples can define the same vector.
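As a computational aside going beyond the lecture text, the following Python sketch (our own code, working over the rational numbers via exact fractions) shows two different coefficient tuples that define the same vector:

```python
from fractions import Fraction as F

def linear_combination(coeffs, vectors):
    """Return s_1*v_1 + ... + s_n*v_n for vectors given as tuples."""
    length = len(vectors[0])
    return tuple(sum(F(c) * F(v[j]) for c, v in zip(coeffs, vectors))
                 for j in range(length))

# v1 and v2 lie on one line, so different coefficient tuples
# can produce the same vector:
v1, v2 = (1, 2), (2, 4)
a = linear_combination([2, 0], [v1, v2])
b = linear_combination([0, 1], [v1, v2])
print(a == b)  # True: both (2, 0) and (0, 1) yield the vector (2, 4)
```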




\inputdefinition
{ }
{

Let $K$ be a field, and let $V$ be a $K$-vector space. A family
\mathcond {v_i \in V} {}
{i \in I} {}
{} {} {} {,} is called a \definitionword {generating system}{} \extrabracket {or \definitionword {spanning system}{}} {} {} of $V$, if every vector
\mathrelationchain
{\relationchain
{v }
{ \in }{V }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} can be written as
\mathrelationchaindisplay
{\relationchain
{v }
{ =} {\sum_{j \in J} s_j v_j }
{ } { }
{ } { }
{ } { }
} {}{}{,} with a finite subfamily
\mathrelationchain
{\relationchain
{J }
{ \subseteq }{I }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} and with
\mathrelationchain
{\relationchain
{ s_j }
{ \in }{ K }
{ }{ }
{ }{ }
{ }{ }
}

{}{}{.}

}

In $K^n$, the standard vectors
\mathcond {e_i} {}
{1 \leq i \leq n} {}
{} {} {} {,} form a generating system. In the polynomial ring \mathl{K[X]}{,} the powers
\mathcond {X^n} {}
{n \in \N} {}
{} {} {} {,} form an \extrabracket {infinite} {} {} generating system.
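The fact that the standard vectors generate $K^n$ can be checked mechanically; here is a small Python illustration (our own code, not part of the lecture): the coefficients of a vector with respect to the standard vectors are just its entries.

```python
def standard_vector(i, n):
    """The standard vector e_i of K^n (0-indexed) as a tuple."""
    return tuple(1 if j == i else 0 for j in range(n))

# any vector is the linear combination of the e_i whose coefficients
# are its own entries:
v = (3, -1, 4)
n = len(v)
reconstructed = tuple(sum(v[i] * standard_vector(i, n)[j] for i in range(n))
                      for j in range(n))
print(reconstructed == v)  # True
```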




\inputdefinition
{ }
{

Let $K$ be a field, and let $V$ be a $K$-vector space. For a family
\mathcond {v_i} {}
{i \in I} {}
{} {} {} {,} we set
\mathrelationchaindisplay
{\relationchain
{ \langle v_i ,\, i \in I \rangle }
{ =} { { \left\{ \sum_{i \in J} s_i v_i \mid s_i \in K , \, J \subseteq I \text{ finite subset} \right\} } }
{ } { }
{ } { }
{ } { }
} {}{}{,}

and call this the \definitionword {linear span}{} of the family, or the \definitionword {generated linear subspace}{.}

}

The empty set generates the null space\extrafootnote {This follows from the definition, if we use the convention that the empty sum equals $0$.} {} {.} The null space is also generated by the element $0$. A single vector $v$ spans the space
\mathrelationchain
{\relationchain
{ Kv }
{ = }{ { \left\{ s v \mid s \in K \right\} } }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} For
\mathrelationchain
{\relationchain
{v }
{ \neq }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} this is a \keyword {line} {,} a term we will make more precise in the framework of dimension theory. For two vectors \mathcor {} {v} {and} {w} {,} the \quotationshort{form}{} of the spanned space depends on how the two vectors are related to each other. If they both lie on a line, say
\mathrelationchain
{\relationchain
{w }
{ = }{ s v }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} then $w$ is superfluous, and the linear subspace generated by the two vectors equals the linear subspace generated by $v$. If this is not the case \extrabracket {and \mathcor {} {v} {and} {w} {} are not $0$} {} {,} then the two vectors span a \quotationshort{plane}{.}
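Whether two vectors span a line or a plane can be tested by checking whether one is a scalar multiple of the other. A Python sketch of this test over the rational numbers (our own illustration, outside the lecture text):

```python
from fractions import Fraction as F

def on_same_line(v, w):
    """True if v and w are linearly dependent, i.e. lie on one line
    through the origin (this includes the case that one of them is 0)."""
    if all(x == 0 for x in v) or all(x == 0 for x in w):
        return True
    # determine the candidate scalar from the first nonzero entry of v
    i = next(j for j, x in enumerate(v) if x != 0)
    s = F(w[i], v[i])
    return all(F(w[j]) == s * F(v[j]) for j in range(len(v)))

print(on_same_line((1, 2), (3, 6)))  # True: w = 3v, the span is a line
print(on_same_line((1, 2), (1, 3)))  # False: the two vectors span a plane
```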

We list some simple properties for generating systems and linear subspaces.




\inputfactproof
{Vector space/Generating system and spanned subspace/Fact}
{Lemma}
{}
{

\factsituation {Let $K$ be a field, and let $V$ be a $K$-vector space.}
\factsegue {Then the following statements hold.}
\factconclusion {\enumerationtwo {For a family
\mathcond {v_i} {}
{i \in I} {}
{} {} {} {,} of elements in $V$, the linear span is a linear subspace of $V$. } {The family
\mathcond {v_i} {}
{i \in I} {}
{} {} {} {,} is a spanning system of $V$, if and only if
\mathrelationchaindisplay
{\relationchain
{ \langle v_i ,\, i\in I \rangle }
{ =} { V }
{ } { }
{ } { }
{ } { }
} {}{}{.} }}
\factextra {}

}
{See Exercise 23.3.}






\subtitle {Linear independence}




\inputdefinition
{ }
{

Let $K$ be a field, and let $V$ be a $K$-vector space. A family of vectors
\mathcond {v_i} {}
{i \in I} {}
{} {} {} {,} \extrabracket {where $I$ denotes a finite index set} {} {} is called \definitionword {linearly independent}{} if an equation of the form
\mathdisp {\sum_{i \in I} s_i v_i =0 \text{ with } s_i \in K} { }
is only possible when
\mathrelationchain
{\relationchain
{ s_i }
{ = }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{}

for all $i$.

}


If a family is not linearly independent, then it is called \keyword {linearly dependent} {.} A linear combination
\mathrelationchain
{\relationchain
{ \sum_{i \in I} s_i v_i }
{ = }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} is called a \keyword { representation of the null vector} {.} It is called the \keyword {trivial representation} {} if all coefficients $s_i$ equal $0$ and, if at least one coefficient is not $0$, a \keyword {nontrivial representation of the null vector} {.} A family of vectors is linearly independent if and only if the only representation of the null vector by the family is the trivial one. This is equivalent to the property that no vector of the family can be expressed as a linear combination of the others.
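Linear independence can be decided algorithmically by Gaussian elimination. The following Python sketch (our own illustration over the rational numbers, not part of the lecture) puts the given vectors into the columns of a matrix; the family is independent exactly when every column acquires a pivot.

```python
from fractions import Fraction as F

def linearly_independent(vectors):
    """Decide linear independence by Gaussian elimination on the matrix
    whose columns are the given vectors: the family is independent if
    and only if every column gets a pivot."""
    m, n = len(vectors[0]), len(vectors)
    A = [[F(vectors[j][i]) for j in range(n)] for i in range(m)]
    pivots = 0
    for col in range(n):
        row = next((r for r in range(pivots, m) if A[r][col] != 0), None)
        if row is None:
            return False  # free column: a nontrivial relation exists
        A[pivots], A[row] = A[row], A[pivots]
        for r in range(m):
            if r != pivots and A[r][col] != 0:
                factor = A[r][col] / A[pivots][col]
                A[r] = [a - factor * b for a, b in zip(A[r], A[pivots])]
        pivots += 1
    return True

print(linearly_independent([(1, 0), (0, 1)]))                  # True
print(linearly_independent([(3, 3, 3), (0, 4, 5), (4, 8, 9)]))  # False
```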




\inputexample{}
{

The standard vectors in $K^n$ are linearly independent. A representation
\mathrelationchaindisplay
{\relationchain
{\sum_{i = 1}^n s_i e_i }
{ =} { 0 }
{ } { }
{ } { }
{ } { }
} {}{}{} just means
\mathrelationchaindisplay
{\relationchain
{ s_1 \begin{pmatrix} 1 \\0\\ \vdots\\0 \end{pmatrix} + s_2 \begin{pmatrix} 0 \\1\\ \vdots\\0 \end{pmatrix} + \cdots + s_n \begin{pmatrix} 0 \\0\\ \vdots\\1 \end{pmatrix} }
{ =} { \begin{pmatrix} 0 \\0\\ \vdots\\0 \end{pmatrix} }
{ } { }
{ } { }
{ } { }
} {}{}{.} The $i$-th row yields directly
\mathrelationchain
{\relationchain
{ s_i }
{ = }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.}

}




\inputexample{}
{

The three vectors \mathlistdisplay {\begin{pmatrix} 3 \\3\\ 3 \end{pmatrix}} {} {\begin{pmatrix} 0 \\4\\ 5 \end{pmatrix},} {and} {\begin{pmatrix} 4 \\8\\ 9 \end{pmatrix}} {} are linearly dependent. The equation
\mathrelationchaindisplay
{\relationchain
{ 4 \begin{pmatrix} 3 \\3\\ 3 \end{pmatrix} + 3 \begin{pmatrix} 0 \\4\\ 5 \end{pmatrix} -3 \begin{pmatrix} 4 \\8\\ 9 \end{pmatrix} }
{ =} { \begin{pmatrix} 0 \\0\\ 0 \end{pmatrix} }
{ } { }
{ } { }
{ } { }
} {}{}{} is a nontrivial representation of the null vector.

}
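The nontrivial representation in this example can be verified directly by computing the stated combination entrywise (a one-line Python check, our own addition):

```python
# verify 4*v1 + 3*v2 - 3*v3 = 0 for the three vectors of the example
v1, v2, v3 = (3, 3, 3), (0, 4, 5), (4, 8, 9)
combo = tuple(4 * a + 3 * b - 3 * c for a, b, c in zip(v1, v2, v3))
print(combo)  # (0, 0, 0)
```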




\inputfactproof
{Linearly independent/Simple properties/Fact}
{Lemma}
{}
{

\factsituation {Let $K$ be a field, let $V$ be a $K$-vector space, and let
\mathcond {v_{ i }} {}
{i \in I} {}
{} {} {} {,} be a family of vectors in $V$.}
\factsegue {Then the following statements hold.}
\factconclusion {\enumerationsix {If the family is linearly independent, then for each subset
\mathrelationchain
{\relationchain
{ J }
{ \subseteq }{ I }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} also the family
\mathcond {v_i} {,}
{i \in J} {}
{} {} {} {,} is linearly independent. } {The empty family is linearly independent. } {If the family contains the null vector, then it is not linearly independent. } {If a vector appears several times in the family, then the family is not linearly independent. } {A single vector $v$ is linearly independent if and only if
\mathrelationchain
{\relationchain
{ v }
{ \neq }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} } {Two vectors \mathcor {} {v} {and} {u} {} are linearly independent if and only if $u$ is not a scalar multiple of $v$ and vice versa. }}
\factextra {}

}
{See Exercise 23.10.}





\inputremark {}
{

The vectors \mathl{v_1 = \begin{pmatrix} a_{11} \\\vdots\\ a_{m1} \end{pmatrix} , \ldots , v_n = \begin{pmatrix} a_{1n} \\\vdots\\ a_{mn} \end{pmatrix} \in K^m}{} are linearly dependent if and only if the homogeneous linear system
\mathdisp {\begin{matrix} a _{ 1 1 } x _1 + a _{ 1 2 } x _2 + \cdots + a _{ 1 n } x _{ n } & = & 0 \\ a _{ 2 1 } x _1 + a _{ 2 2 } x _2 + \cdots + a _{ 2 n } x _{ n } & = & 0 \\ \vdots & \vdots & \vdots \\ a _{ m 1 } x _1 + a _{ m 2 } x _2 + \cdots + a _{ m n } x _{ n } & = & 0 \end{matrix}} { }
has a nontrivial solution.

}
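This equivalence is constructive: a nontrivial solution of the homogeneous system yields the coefficients of a nontrivial representation of the null vector. A Python sketch of this computation over the rational numbers (our own illustration, not part of the lecture):

```python
from fractions import Fraction as F

def nontrivial_kernel_vector(columns):
    """Return a nontrivial solution x of the homogeneous system whose
    coefficient matrix has the given columns, or None if only the
    trivial solution exists."""
    m, n = len(columns[0]), len(columns)
    A = [[F(columns[j][i]) for j in range(n)] for i in range(m)]
    pivot_cols, row = [], 0
    for col in range(n):
        r = next((k for k in range(row, m) if A[k][col] != 0), None)
        if r is None:
            continue  # free column
        A[row], A[r] = A[r], A[row]
        for k in range(m):
            if k != row and A[k][col] != 0:
                f = A[k][col] / A[row][col]
                A[k] = [a - f * b for a, b in zip(A[k], A[row])]
        pivot_cols.append(col)
        row += 1
    free = [c for c in range(n) if c not in pivot_cols]
    if not free:
        return None
    # set the first free variable to 1 and solve for the pivot variables
    x = [F(0)] * n
    x[free[0]] = F(1)
    for r, pc in enumerate(pivot_cols):
        x[pc] = -A[r][free[0]] / A[r][pc]
    return x

cols = [(3, 3, 3), (0, 4, 5), (4, 8, 9)]
x = nontrivial_kernel_vector(cols)
print(x is not None)  # True: the three vectors are linearly dependent
```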






\subtitle {Basis}




\inputdefinition
{ }
{

Let $K$ be a field, and let $V$ be a $K$-vector space. Then a linearly independent generating system
\mathcond {v_i \in V} {}
{i \in I} {}
{} {} {} {,}

of $V$ is called a \definitionword {basis}{} of $V$.

}




\inputexample{}
{

The standard vectors in $K^n$ form a basis. The linear independence was shown in Example 23.6. To show that they also form a generating system, let
\mathrelationchaindisplay
{\relationchain
{v }
{ =} { \begin{pmatrix} b_1 \\b_2\\ \vdots\\b_n \end{pmatrix} }
{ \in} { K^n }
{ } { }
{ } { }
} {}{}{} be an arbitrary vector. Then we have immediately
\mathrelationchaindisplay
{\relationchain
{v }
{ =} { \sum_{i = 1}^n b_i e_i }
{ } { }
{ } { }
{ } { }
} {}{}{.} Hence, we have a basis, which is called the \keyword {standard basis} {} of $K^n$.

}




\inputfaktbeweisnichtvorgefuehrt
{Vector space/Characterizations of basis/Maximal/Minimal/Fact}
{Theorem}
{}
{

\factsituation {Let $K$ be a field, and let $V$ be a $K$-vector space. Let
\mathrelationchain
{\relationchain
{ v_1 , \ldots , v_n }
{ \in }{ V }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} be a family of vectors.}
\factsegue {Then the following statements are equivalent.}
\factconclusion {\enumerationfour {The family is a basis of $V$. } {The family is a minimal generating system; that is, as soon as we remove one vector $v_i$, the remaining family is not a generating system any more. } {For every vector
\mathrelationchain
{\relationchain
{u }
{ \in }{V }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} there is exactly one representation
\mathrelationchaindisplay
{\relationchain
{u }
{ =} { s_1 v_1 + \cdots + s_n v_n }
{ } { }
{ } { }
{ } { }
} {}{}{.} } {The family is maximally linearly independent; that is, as soon as some vector is added, the family is not linearly independent any more. }}
\factextra {}
}
{

We prove the implications in a circular way. $(1) \Rightarrow (2)$. The family is a generating system. Let us remove a vector, say $v_1$, from the family. We have to show that the remaining family, that is \mathl{v_2 , \ldots , v_n}{,} is not a generating system anymore. So suppose that it is still a generating system. Then, in particular, $v_1$ can be written as a linear combination of the remaining vectors, and we have
\mathrelationchaindisplay
{\relationchain
{ v_1 }
{ =} { \sum_{i = 2}^n s_i v_i }
{ } { }
{ } { }
{ } { }
} {}{}{.} But then
\mathrelationchaindisplay
{\relationchain
{ v_1- \sum_{i = 2}^n s_i v_i }
{ =} { 0 }
{ } { }
{ } { }
{ } { }
} {}{}{} is a nontrivial representation of $0$, contradicting the linear independence of the family. $(2) \Rightarrow (3)$. Due to the condition, the family is a generating system, hence every vector can be represented as a linear combination. Suppose that for some
\mathrelationchain
{\relationchain
{ u }
{ \in }{V }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} there is more than one representation, say
\mathrelationchaindisplay
{\relationchain
{ u }
{ =} { \sum_{i = 1}^n s_i v_i }
{ =} { \sum_{i = 1}^n t_i v_i }
{ } { }
{ } { }
} {}{}{,} where at least one coefficient is different. Without loss of generality, we may assume
\mathrelationchain
{\relationchain
{ s_1 }
{ \neq }{ t_1 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} Then we get the relation
\mathrelationchaindisplay
{\relationchain
{ { \left( s_1 - t_1 \right) } v_1 }
{ =} { \sum_{i = 2}^n { \left( t_i- s_i \right) } v_i }
{ } { }
{ } { }
{ } { }
} {}{}{.} Because of
\mathrelationchain
{\relationchain
{ s_1 - t_1 }
{ \neq }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} we can divide by this number and obtain a representation of $v_1$ using the other vectors. In this situation, due to exercise *****, the family without $v_1$ is also a generating system of $V$, contradicting the minimality. $(3) \Rightarrow (4)$. Because of the unique representability, the zero vector has only the trivial representation. This means that the vectors are linearly independent. If we add a vector $u$, then it has a representation
\mathrelationchaindisplay
{\relationchain
{ u }
{ =} { \sum_{i = 1}^n s_i v_i }
{ } { }
{ } { }
{ } { }
} {}{}{,} and, therefore,
\mathrelationchaindisplay
{\relationchain
{ 0 }
{ =} { u- \sum_{i = 1}^n s_i v_i }
{ } { }
{ } { }
{ } { }
} {}{}{} is a non-trivial representation of $0$, so that the extended family \mathl{u,v_1 , \ldots , v_n}{} is not linearly independent. $(4) \Rightarrow (1)$. The family is linearly independent; we have to show that it is also a generating system. Let
\mathrelationchain
{\relationchain
{ u }
{ \in }{V }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} Due to the condition, the family \mathl{u,v_1 , \ldots , v_n}{} is not linearly independent. This means that there exists a non-trivial representation
\mathrelationchaindisplay
{\relationchain
{ 0 }
{ =} { s u + \sum_{i = 1}^n s_iv_i }
{ } { }
{ } { }
{ } { }
} {}{}{.} Here
\mathrelationchain
{\relationchain
{ s }
{ \neq }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} because otherwise this would be a non-trivial representation of $0$ with the original family \mathl{v_1 , \ldots , v_n}{.} Hence, we can write
\mathrelationchaindisplay
{\relationchain
{ u }
{ =} { - \sum_{i = 1}^n \frac{ s_i}{ s } v_i }
{ } { }
{ } { }
{ } { }
} {}{}{,} yielding a representation for $u$.

}





\inputremark {}
{

Let a basis \mathl{v_1 , \ldots , v_n}{} of a $K$-vector space $V$ be given. Due to Theorem 23.12 (3), this means that for every vector
\mathrelationchain
{\relationchain
{u }
{ \in }{V }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} there exists a uniquely determined representation
\mathrelationchaindisplay
{\relationchain
{u }
{ =} { s_1 v_1 + s_2 v_2 + \cdots + s_n v_n }
{ } { }
{ } { }
{ } { }
} {}{}{.} The elements
\mathrelationchain
{\relationchain
{s_i }
{ \in }{K }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} \extrabracket {scalars} {} {} are called the \keyword {coordinates} {} of $u$ with respect to the given basis. Thus, for a fixed basis, we have a \extrabracket {bijective} {} {} correspondence between the vectors from $V$, and the coordinate tuples
\mathrelationchain
{\relationchain
{ (s_1,s_2 , \ldots , s_n) }
{ \in }{ K^n }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} We express this by saying that a basis determines a \keyword {linear coordinate system} {\extrafootnote {Linear coordinates give a bijective relation between points and number tuples. Due to linearity, such a bijection respects addition and scalar multiplication. In many different contexts, also nonlinear \extrabracket {curvilinear} {} {} coordinates are important. These put points of a space and number tuples into a bijective relation. Examples are polar coordinates, cylindrical coordinates, and spherical coordinates. By choosing suitable coordinates, mathematical problems, like the computation of volumes, can be simplified.} {} {.}}

}
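The passage from a vector to its coordinate tuple amounts to solving a linear system whose columns are the basis vectors. A Python sketch over the rational numbers (our own illustration; it assumes the given family really is a basis):

```python
from fractions import Fraction as F

def coordinates(u, basis):
    """Coordinates of u with respect to a basis of Q^n, computed by
    Gauss-Jordan elimination on the augmented matrix whose columns are
    the basis vectors and u. Assumes the family really is a basis."""
    n = len(u)
    A = [[F(basis[j][i]) for j in range(n)] + [F(u[i])] for i in range(n)]
    for col in range(n):
        r = next(k for k in range(col, n) if A[k][col] != 0)
        A[col], A[r] = A[r], A[col]
        p = A[col][col]
        A[col] = [a / p for a in A[col]]
        for k in range(n):
            if k != col and A[k][col] != 0:
                f = A[k][col]
                A[k] = [a - f * b for a, b in zip(A[k], A[col])]
    return [A[i][n] for i in range(n)]

# coordinates of u = (5, 1) with respect to the basis (1, 1), (1, -1):
coords = coordinates((5, 1), [(1, 1), (1, -1)])
print(coords == [3, 2])  # True: u = 3*(1,1) + 2*(1,-1)
```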




\inputfactproof
{Vector space/Finitely generated/Basis/Fact}
{Theorem}
{}
{

\factsituation {Let $K$ be a field, and let $V$ be a $K$-vector space with a finite generating system.}
\factconclusion {Then $V$ has a finite basis.}
\factextra {}
}
{

Let
\mathcond {v_i} {}
{i \in I} {}
{} {} {} {,} be a finite generating system of $V$ with a finite index set $I$. We argue with the characterization from Theorem 23.12 (2). If the family is minimal, then we have a basis. If not, then there exists some
\mathrelationchain
{\relationchain
{k }
{ \in }{I }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} such that the remaining family, where $v_k$ is removed, that is,
\mathcond {v_i} {}
{i \in I \setminus \{k\}} {}
{} {} {} {,} is also a generating system. In this case, we can go on with this smaller index set. With this method, we arrive at a subset
\mathrelationchain
{\relationchain
{J }
{ \subseteq }{I }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} such that
\mathcond {v_i} {}
{i \in J} {}
{} {} {} {,} is a minimal generating set, hence a basis.

}
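The proof is an algorithm: repeatedly drop a vector whose removal does not shrink the span. A Python sketch over the rational numbers (our own illustration; redundancy is tested by comparing ranks via Gaussian elimination):

```python
from fractions import Fraction as F

def rank(vectors):
    """Rank of the family via Gaussian elimination (rows = vectors)."""
    if not vectors:
        return 0
    A = [[F(x) for x in v] for v in vectors]
    r = 0
    for col in range(len(A[0])):
        piv = next((k for k in range(r, len(A)) if A[k][col] != 0), None)
        if piv is None:
            continue
        A[r], A[piv] = A[piv], A[r]
        for k in range(len(A)):
            if k != r and A[k][col] != 0:
                f = A[k][col] / A[r][col]
                A[k] = [a - f * b for a, b in zip(A[k], A[r])]
        r += 1
    return r

def shrink_to_basis(vectors):
    """Remove superfluous vectors one by one, as in the proof: drop a
    vector whenever the remaining family still has the same rank."""
    basis = list(vectors)
    k = 0
    while k < len(basis):
        rest = basis[:k] + basis[k + 1:]
        if rank(rest) == rank(basis):  # the k-th vector is redundant
            basis = rest
        else:
            k += 1
    return basis

b = shrink_to_basis([(3, 3, 3), (0, 4, 5), (4, 8, 9), (1, 0, 0)])
print(len(b))  # 3
```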






\subtitle {Dimension theory}

A finitely generated vector space has many quite different bases. However, the number of elements in a basis is constant and depends only on the vector space. We will now formulate this important property and take it as the point of departure for the definition of the dimension of a vector space.




\inputfaktbeweisnichtvorgefuehrt
{Vector space/Finitely generated/Length of every basis/Fact}
{Theorem}
{}
{

\factsituation {Let $K$ be a field, and let $V$ be a $K$-vector space with a finite generating system.}
\factconclusion {Then any two bases of $V$ have the same number of vectors.}
\factextra {}
}
{

Let
\mathrelationchain
{\relationchain
{ \mathfrak{ b } }
{ = }{ b_1 , \ldots , b_n }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} and
\mathrelationchain
{\relationchain
{ \mathfrak{ u } }
{ = }{ u_1 , \ldots , u_k }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} denote two bases of $V$. According to the basis exchange theorem, applied to the basis $\mathfrak{ b }$ and the linearly independent family $\mathfrak{ u }$, we obtain
\mathrelationchain
{\relationchain
{ k }
{ \leq }{ n }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} When we apply the theorem with roles reversed, we get
\mathrelationchain
{\relationchain
{ n }
{ \leq }{ k }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} thus
\mathrelationchain
{\relationchain
{ n }
{ = }{ k }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.}

}


This theorem enables the following definition.


\inputdefinition
{ }
{

Let $K$ be a field, and let $V$ be a $K$-vector space with a finite generating system. Then the number of vectors in any basis of $V$ is called the \definitionword {dimension}{} of $V$, written


\mathdisp {\dim_{ K } { \left( V \right) }} { . }

}

Due to the preceding theorem, the dimension is \keyword {well-defined} {.} If a vector space is not finitely generated, then one puts
\mathrelationchain
{\relationchain
{ \dim_{ K } { \left( V \right) } }
{ = }{ \infty }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} The null space $0$ has dimension $0$. A one-dimensional vector space is called a \keyword {line} {,} a two-dimensional vector space a \keyword {plane} {,} and a three-dimensional vector space a \keyword {space} {} \extrabracket {in the strict sense} {} {,} although every vector space is also called a space.




\inputfactproof
{Standard space/K^n/Dimension n/Fact}
{Corollary}
{}
{

\factsituation {Let $K$ be a field, and
\mathrelationchain
{\relationchain
{n }
{ \in }{\N }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.}}
\factconclusion {Then the standard space $K^n$ has the dimension $n$.}
\factextra {}
}
{

The standard basis
\mathcond {e_i} {}
{i = 1 , \ldots , n} {}
{} {} {} {,} consists of $n$ vectors; hence, the dimension is $n$.

}





\inputexample{}
{

The complex numbers form a two-dimensional real vector space; a basis is \mathcor {} {1} {and} {{ \mathrm i}} {.}

}




\inputexample{}
{

The polynomial ring
\mathrelationchain
{\relationchain
{R }
{ = }{K[X] }
{ }{ }
{ }{ }
{ }{ }
over a field $K$ is not a finite-dimensional vector space. To see this, we have to show that there is no finite generating system for the polynomial ring. Consider $n$ polynomials \mathl{P_1 , \ldots , P_n}{.} Let $d$ be the maximum of the degrees of these polynomials. Then every $K$-linear combination \mathl{\sum_{i=1}^n a_i P_i}{} has degree at most $d$. In particular, polynomials of larger degree cannot be represented by \mathl{P_1 , \ldots , P_n}{,} so these do not form a generating system for all polynomials.

}
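The degree bound in this argument is easy to check experimentally; here is a small Python sketch (our own illustration, with polynomials represented by coefficient tuples, constant term first):

```python
def degree(p):
    """Degree of a polynomial given by its coefficient tuple
    (constant term first); -1 for the zero polynomial."""
    nonzero = [i for i, c in enumerate(p) if c != 0]
    return nonzero[-1] if nonzero else -1

def linear_combination(coeffs, polys):
    """Linear combination of polynomials as coefficient tuples."""
    m = max(len(p) for p in polys)
    return tuple(sum(c * (p[i] if i < len(p) else 0)
                     for c, p in zip(coeffs, polys)) for i in range(m))

polys = [(1, 2), (0, 0, 3), (5,)]   # 1 + 2X, 3X^2, 5
d = max(degree(p) for p in polys)   # d = 2
combo = linear_combination([7, -1, 2], polys)
print(degree(combo) <= d)  # True: no combination exceeds degree d
```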




\inputfaktbeweisnichtvorgefuehrt
{Vector space/Linear subspace/Dimensions/Compare/Fact}
{Corollary}
{}
{

\factsituation {Let $V$ denote a finite-dimensional vector space over a field $K$. Let
\mathrelationchain
{\relationchain
{U }
{ \subseteq }{V }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} denote a linear subspace.}
\factconclusion {Then $U$ is also finite-dimensional, and the estimate
\mathrelationchaindisplay
{\relationchain
{ \dim_{ K } { \left( U \right) } }
{ \leq} { \dim_{ K } { \left( V \right) } }
{ } { }
{ } { }
{ } { }
} {}{}{} holds.}
\factextra {}
}
{

Set
\mathrelationchain
{\relationchain
{n }
{ = }{ \dim_{ K } { \left( V \right) } }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} Every linearly independent family in $U$ is also linearly independent in $V$. Therefore, due to the basis exchange theorem, every linearly independent family in $U$ has length $\leq n$. Suppose that
\mathrelationchain
{\relationchain
{k }
{ \leq }{n }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} has the property that there exists a linearly independent family with $k$ vectors in $U$ but no such family with \mathl{k+1}{} vectors. Let
\mathrelationchain
{\relationchain
{ \mathfrak{ u } }
{ = }{ u_1 , \ldots , u_k }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} be such a family. This is then a maximal linearly independent family in $U$. Therefore, due to Theorem 23.12, it is a basis of $U$.

}





\inputfactproof
{Vector space/Dimension n and n vectors/Equivalences/Fact}
{Corollary}
{}
{

\factsituation {Let $K$ be a field, and let $V$ be a $K$-vector space with finite dimension
\mathrelationchain
{\relationchain
{n }
{ = }{ \dim_{ K } { \left( V \right) } }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.}}
\factcondition {Let $n$ vectors \mathl{v_1 , \ldots , v_n}{} in $V$ be given.}
\factconclusion {Then the following properties are equivalent. \enumerationthree {\mathl{v_1 , \ldots , v_n}{} form a basis of $V$. } {\mathl{v_1 , \ldots , v_n}{} form a generating system of $V$. } {\mathl{v_1 , \ldots , v_n}{} are linearly independent. }}
\factextra {}

}
{See Exercise 23.18.}





\inputexample{}
{

Let $K$ be a field. It is easy to get an overview of the linear subspaces of $K^n$, as the dimension of a linear subspace equals
\mathcond {k} {with}
{0 \leq k \leq n} {}
{} {} {} {,} due to Corollary 23.20. For
\mathrelationchain
{\relationchain
{n }
{ = }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} there is only the null space itself; for
\mathrelationchain
{\relationchain
{n }
{ = }{1 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} there is the null space and $K$ itself. For
\mathrelationchain
{\relationchain
{n }
{ = }{2 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} there is the null space, the whole plane $K^2$, and the one-dimensional lines through the origin. Every line $G$ has the form
\mathrelationchaindisplay
{\relationchain
{G }
{ =} { Kv }
{ =} { { \left\{ s v \mid s \in K \right\} } }
{ } { }
{ } { }
} {}{}{,} with a vector
\mathrelationchain
{\relationchain
{ v }
{ \neq }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} Two vectors different from $0$ define the same line if and only if they are linearly dependent. For
\mathrelationchain
{\relationchain
{n }
{ = }{3 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} there is the null space, the whole space $K^3$, the one-dimensional lines through the origin, and the two-dimensional planes through the origin.

}




\inputfaktbeweisnichtvorgefuehrt
{Vector space/Basis complement/Fact}
{Theorem}
{}
{

\factsituation {Let $V$ denote a finite-dimensional vector space over a field $K$.}
\factcondition {Let
\mathdisp {u_1 , \ldots , u_k} { }
denote linearly independent vectors in $V$.}
\factconclusion {Then there exist vectors
\mathdisp {u_{k+1} , \ldots , u_n} { }
such that
\mathdisp {u_1 , \ldots , u_k, u_{k+1} , \ldots , u_n} { }
form a basis of $V$.}
\factextra {}
}
{

Let \mathl{b_1 , \ldots , b_n}{} be a basis of $V$. Due to the basis exchange theorem, there are \mathl{n-k}{} vectors from the basis $\mathfrak{ b }$ that, together with the given vectors \mathl{u_1 , \ldots , u_k}{,} form a basis of $V$.

}
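This completion argument can also be carried out algorithmically: adjoin standard vectors one at a time, keeping only those that enlarge the span. A Python sketch over the rational numbers (our own illustration, not part of the lecture):

```python
from fractions import Fraction as F

def rank(vectors):
    """Rank of the family via Gaussian elimination (rows = vectors)."""
    if not vectors:
        return 0
    A = [[F(x) for x in v] for v in vectors]
    r = 0
    for col in range(len(A[0])):
        piv = next((k for k in range(r, len(A)) if A[k][col] != 0), None)
        if piv is None:
            continue
        A[r], A[piv] = A[piv], A[r]
        for k in range(len(A)):
            if k != r and A[k][col] != 0:
                f = A[k][col] / A[r][col]
                A[k] = [a - f * b for a, b in zip(A[k], A[r])]
        r += 1
    return r

def extend_to_basis(independent, n):
    """Extend a linearly independent family in Q^n to a basis of Q^n by
    adjoining those standard vectors that increase the rank."""
    basis = list(independent)
    for i in range(n):
        e = tuple(1 if j == i else 0 for j in range(n))
        if rank(basis + [e]) > rank(basis):
            basis.append(e)
    return basis

b = extend_to_basis([(1, 1, 1)], 3)
print(len(b))  # 3
```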