Linear algebra (Osnabrück 2024-2025)/Part I/Lecture 6/latex


\setcounter{section}{6}






\subtitle {Vector spaces}






\image{ \begin{center}
\includegraphics[width=5.5cm]{\imageinclude {Vector Addition.svg} }
\end{center}
\imagetext {The addition of two arrows $a$ and $b$, a typical example for vectors.} }

\imagelicense { Vector Addition.svg } {} {Booyabazooka} {Commons} {PD} {}

The central concept of linear algebra is that of a vector space.


\inputdefinition
{ }
{

Let $K$ denote a field, and $V$ a set with a distinguished element
\mathrelationchain
{\relationchain
{0 }
{ \in }{V }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} and with two mappings
\mathdisp {+ \colon V \times V \longrightarrow V , (u,v) \longmapsto u+v} { , }
and
\mathdisp {\cdot \colon K \times V \longrightarrow V , (s,v) \longmapsto s v = s \cdot v} { . }
Then $V$ is called a \definitionwordpremath {K}{ vector space }{} \extrabracket {or a vector space over $K$} {} {,} if the following axioms hold \extrabracket {where \mathcor {} {r,s \in K} {and} {u,v,w \in V} {} are arbitrary} {} {.}

\enumerationeight {
\mathrelationchain
{\relationchain
{ u+v }
{ = }{v+u }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} } {
\mathrelationchain
{\relationchain
{(u+v)+w }
{ = }{ u +(v+w) }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} } {
\mathrelationchain
{\relationchain
{ v+0 }
{ = }{v }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} } {For every $v$, there exists a $z$ such that
\mathrelationchain
{\relationchain
{v+z }
{ = }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} } {
\mathrelationchain
{\relationchain
{1 \cdot u }
{ = }{ u }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} } {
\mathrelationchain
{\relationchain
{ r(su) }
{ = }{ (rs) u }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} } {
\mathrelationchain
{\relationchain
{ r(u+v) }
{ = }{ru + rv }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} } {
\mathrelationchain
{\relationchain
{ (r+s) u }
{ = }{ru + su }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.}

}

}

The binary operation in $V$ is called (vector-)addition, and the operation $K \times V \rightarrow V$ is called \keyword {scalar multiplication} {.} The elements in a vector space are called \keyword {vectors} {,} and the elements
\mathrelationchain
{\relationchain
{r }
{ \in }{K }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} are called \keyword {scalars} {.} The null element
\mathrelationchain
{\relationchain
{ 0 }
{ \in }{V }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} is called \keyword {null vector} {,} and for
\mathrelationchain
{\relationchain
{v }
{ \in }{V }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} the inverse element, with respect to the addition, is called the \keyword {negative} {} of $v$, denoted by $-v$.

The field that occurs in the definition of a vector space is called the \keyword {base field} {.} All the concepts of linear algebra refer to such a base field. In case
\mathrelationchain
{\relationchain
{K }
{ = }{\R }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} we talk about a \keyword {real vector space} {,} and in case
\mathrelationchain
{\relationchain
{K }
{ = }{ \Complex }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} we talk about a \keyword {complex vector space} {.} For real and complex vector spaces, there exist further structures like length, angle, inner product. But first we develop the algebraic theory of vector spaces over an arbitrary field.






\image{ \begin{center}
\includegraphics[width=5.5cm]{\imageinclude {Vector_space_illust.svg} }
\end{center}
\imagetext {} }

\imagelicense { Vector space illust.svg } {} {Oleg Alexandrov} {Commons} {PD} {}





\inputexample{}
{

Let $K$ denote a field, and let
\mathrelationchain
{\relationchain
{ n }
{ \in }{ \N_+ }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} Then the product set
\mathrelationchaindisplay
{\relationchain
{ K^n }
{ =} { \underbrace{K \times \cdots \times K }_{n\text{-times} } }
{ =} { { \left\{ (x_1 , \ldots , x_{ n }) \mid x_i \in K \right\} } }
{ } {}
{ } {}
} {}{}{,} with componentwise addition and with scalar multiplication given by
\mathrelationchaindisplay
{\relationchain
{ s (x_1 , \ldots , x_{ n }) }
{ =} { (s x_1 , \ldots , s x_{ n }) }
{ } { }
{ } { }
{ } { }
} {}{}{,} is a vector space. This space is called the $n$-dimensional \keyword {standard space} {.} In particular,
\mathrelationchain
{\relationchain
{K^1 }
{ = }{K }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} is a vector space.

}
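
For instance, in the standard space $\R^3$, these componentwise operations yield
\mathdisp {(1,2,-1) + (0,4,5) = (1,6,4) \text{ and } 3 \cdot (1,2,-1) = (3,6,-3)} { . }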

The null space $0$, consisting of just one element $0$, is a vector space. It might be considered as
\mathrelationchain
{\relationchain
{K^0 }
{ = }{0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.}

The vectors in the standard space $K^n$ can be written as row vectors
\mathdisp {\left( a_1 , \, a_2 , \, \ldots , \, a_n \right)} { }
or as column vectors
\mathdisp {\begin{pmatrix} a_1 \\a_2\\ \vdots\\a_n \end{pmatrix}} { . }
The vector
\mathrelationchaindisplay
{\relationchain
{ e_i }
{ \defeq} { \begin{pmatrix} 0 \\ \vdots\\ 0\\1\\ 0\\ \vdots\\ 0 \end{pmatrix} }
{ } { }
{ } { }
{ } { }
} {}{}{,} where the $1$ is at the $i$-th position, is called the $i$-th \keyword {standard vector} {.}


\inputexample{}
{

Let $E$ be a \quotationshort{plane}{} with a fixed \quotationshort{origin}{}
\mathrelationchain
{\relationchain
{ Q }
{ \in }{ E }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} We identify a point
\mathrelationchain
{\relationchain
{ P }
{ \in }{E }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} with the connecting vector \mathl{\overrightarrow{ Q P }}{} \extrabracket {the arrow from $Q$ to $P$} {} {.} In this situation, we can introduce an intuitive coordinate-free vector addition and a coordinate-free scalar multiplication. Two vectors \mathcor {} {\overrightarrow{ Q P }} {and} {\overrightarrow{ Q R }} {} are added together by constructing the parallelogram of these vectors. The result of the addition is the corner of the parallelogram that lies opposite to $Q$. In this construction, we have to draw a line parallel to \mathl{\overrightarrow{ Q P }}{} through $R$ and a line parallel to \mathl{\overrightarrow{ Q R }}{} through $P$. The intersection point is the point we are looking for. An accompanying idea is that we move the vector \mathl{\overrightarrow{ Q P }}{} in a parallel way so that its new starting point becomes the ending point of \mathl{\overrightarrow{ Q R }}{.}

In order to describe the multiplication of a vector \mathl{\overrightarrow{ Q P }}{} with a scalar $s$, this number has to be given on a line $G$ that is also marked with a zero point
\mathrelationchain
{\relationchain
{ 0 }
{ \in }{ G }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} and a unit point
\mathrelationchain
{\relationchain
{ 1 }
{ \in }{ G }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} This line lies somewhere in the plane. We move this line \extrabracket {by translating and rotating} {} {} in such a way that $0$ becomes $Q$, making sure that the line does not coincide with the line given by \mathl{\overrightarrow{ Q P }}{} \extrabracket {which we call $H$} {} {.}

Now we connect $1$ and $P$ with a line $L$, and we draw the line $L'$ parallel to $L$ through $s$. The intersection point of \mathcor {} {L'} {and} {H} {} is \mathl{s \overrightarrow{ Q P }}{.}

These considerations can also be carried out in higher dimensions, but everything already takes place in the plane spanned by these vectors.

}




\inputexample{ }
{

The complex numbers $\Complex$ form a field, and therefore they also form a vector space over the field $\Complex$ itself. However, the set of complex numbers equals $\R^2$ as an additive group. The multiplication of a complex number \mathl{a+b { \mathrm i}}{} with a real number
\mathrelationchain
{\relationchain
{ s }
{ = }{ (s,0) }
{ }{ }
{ }{ }
{ }{}
} {}{}{} is componentwise, so this multiplication coincides with the scalar multiplication on $\R^2$. Hence, the set of complex numbers is also a real vector space.

}
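
For instance, for a real number $s$ and a complex number \mathl{a+b { \mathrm i}}{,} we have
\mathdisp {s \cdot ( a + b { \mathrm i} ) = sa + sb { \mathrm i}} { , }
which is exactly the scalar multiplication $s(a,b) = (sa,sb)$ on $\R^2$.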




\inputexample{}
{

For a field $K$, and given natural numbers \mathl{m,n}{,} the set
\mathdisp {\operatorname{Mat}_{ m \times n } (K)} { }
of all \mathl{m \times n}{-}matrices, endowed with componentwise addition and componentwise scalar multiplication, is a $K$-vector space. The null element in this vector space is the \keyword {null matrix} {}
\mathrelationchaindisplay
{\relationchain
{0 }
{ =} { \begin{pmatrix} 0 & \ldots & 0 \\ \vdots & \ddots & \vdots \\0 & \ldots & 0 \end{pmatrix} }
{ } { }
{ } { }
{ } { }
} {}{}{.}

}
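
For instance, in $\operatorname{Mat}_{ 2 \times 2 } (\R)$, the componentwise operations yield
\mathdisp {\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} + \begin{pmatrix} 0 & 1 \\ -3 & 2 \end{pmatrix} = \begin{pmatrix} 1 & 3 \\ 0 & 6 \end{pmatrix} \text{ and } 2 \cdot \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = \begin{pmatrix} 2 & 4 \\ 6 & 8 \end{pmatrix}} { . }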

We will introduce polynomials later; they are probably known from school.


\inputexample{}
{

Let
\mathrelationchain
{\relationchain
{ R }
{ = }{ K[X] }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} be the polynomial ring in one variable over the field $K$, consisting of all polynomials, that is, expressions of the form
\mathdisp {a_nX^n+a_{n-1}X^{n-1} + \cdots + a_2X^2+a_1X+a_0} { , }
with
\mathrelationchain
{\relationchain
{ a_i }
{ \in }{ K }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} Using componentwise addition and componentwise multiplication with a scalar
\mathrelationchain
{\relationchain
{s }
{ \in }{ K }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} \extrabracket {this is also multiplication with the constant polynomial $s$} {} {,} the polynomial ring is a $K$-vector space.

}
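
For instance, in $\R[X]$, we have
\mathdisp {(2X^2+X+1) + (X^2-3X) = 3X^2-2X+1 \text{ and } 5 \cdot (2X^2+X+1) = 10X^2+5X+5} { . }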




\inputexample{}
{

We consider the inclusion
\mathrelationchain
{\relationchain
{ \Q }
{ \subseteq }{ \R }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} of the rational numbers inside the real numbers. Using the addition of real numbers and the multiplication of rational numbers by real numbers, we see that $\R$ is a $\Q$-vector space, as follows directly from the field axioms. This is quite a crazy vector space.

}




\inputfactproof
{Vector space/Simple properties/Fact}
{Lemma}
{}
{

\factsituation {Let $K$ be a field, and let $V$ be a $K$-vector space.}
\factsegue {Then the following properties hold \extrabracket {for \mathcor {} {v \in V} {and} {s \in K} {}} {} {.}}
\factconclusion {\enumerationfour {We have
\mathrelationchain
{\relationchain
{ 0v }
{ = }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.}

} {We have
\mathrelationchain
{\relationchain
{ s 0 }
{ = }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} } {We have
\mathrelationchain
{\relationchain
{ (-1) v }
{ = }{ -v }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} } {If
\mathrelationchain
{\relationchain
{ s }
{ \neq }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} and
\mathrelationchain
{\relationchain
{ v }
{ \neq }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} then
\mathrelationchain
{\relationchain
{ s v }
{ \neq }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} }}
\factextra {}

}
{See Exercise 6.27 .}
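
For instance, the first property follows directly from the axioms: we have
\mathdisp {0 v = (0+0) v = 0v + 0v} { , }
and adding the negative of $0v$ to both sides yields $0v = 0$.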






\subtitle {Linear subspaces}




\inputdefinition
{ }
{

Let $K$ be a field, and let $V$ be a $K$-vector space. A subset
\mathrelationchain
{\relationchain
{ U }
{ \subseteq }{ V }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} is called a \definitionword {linear subspace}{} if the following properties hold. \enumerationthree {
\mathrelationchain
{\relationchain
{ 0 }
{ \in }{ U }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} } {If
\mathrelationchain
{\relationchain
{ u,v }
{ \in }{U }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} then also
\mathrelationchain
{\relationchain
{ u+v }
{ \in }{U }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} } {If
\mathrelationchain
{\relationchain
{ u }
{ \in }{ U }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} and
\mathrelationchain
{\relationchain
{ s }
{ \in }{ K }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} then also
\mathrelationchain
{\relationchain
{ s u }
{ \in }{ U }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} holds.

}

}

Addition and scalar multiplication can be restricted to such a linear subspace. Hence, the linear subspace is itself a vector space, see Exercise 6.10 . The simplest linear subspaces in a vector space $V$ are the null space $0$ and the whole vector space $V$.
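
For instance, the subset
\mathdisp {U = { \left\{ (x,0) \mid x \in K \right\} } \subseteq K^2} { }
is a linear subspace: it contains the null vector $(0,0)$, and it is closed under addition and scalar multiplication, because of $(x,0)+(x',0) = (x+x',0)$ and $s(x,0) = (sx,0)$.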




\inputfactproof
{System of linear equations/Set of variables/Solution space is vector space/Fact}
{Lemma}
{}
{

\factsituation {Let $K$ be a field, and let
\mathdisp {\begin{matrix} a _{ 1 1 } x _1 + a _{ 1 2 } x _2 + \cdots + a _{ 1 n } x _{ n } & = & 0 \\ a _{ 2 1 } x _1 + a _{ 2 2 } x _2 + \cdots + a _{ 2 n } x _{ n } & = & 0 \\ \vdots & \vdots & \vdots \\ a _{ m 1 } x _1 + a _{ m 2 } x _2 + \cdots + a _{ m n } x _{ n } & = & 0 \end{matrix}} { }
be a homogeneous system of linear equations over $K$.}
\factconclusion {Then the set of all solutions to the system is a linear subspace of the standard space $K^n$.}
\factextra {}

}
{See Exercise 6.3 .}


Therefore, we talk about the \keyword {solution space} {} of the linear system. In particular, the sum of two solutions of a homogeneous system of linear equations is again a solution. The solution set of an inhomogeneous linear system is not a vector space. However, adding a solution of the corresponding homogeneous system to a solution of an inhomogeneous system yields again a solution of the inhomogeneous system.
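
For instance, the solution set of the inhomogeneous equation $x+y = 1$ over $\R$ is not a linear subspace of $\R^2$: it does not contain the null vector, and the sum of the solutions $(1,0)$ and $(0,1)$ is $(1,1)$, which is not a solution. However, adding the solution $(1,-1)$ of the corresponding homogeneous equation $x+y = 0$ to the solution $(1,0)$ yields the solution $(2,-1)$ of the inhomogeneous equation.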




\inputexample{}
{

We take a look at the homogeneous version of Example 5.1 , so we consider the homogeneous linear system
\mathdisp {\begin{matrix} 2x & +5y & +2z & & -v & = & 0 \\ \, 3x & -4y & & +u & +2v & = & 0 \\ \, 4x & & -2z & +2u & & = & 0 \, \end{matrix}} { }
over $\R$. Due to Lemma 6.11 , the solution set $L$ is a linear subspace of $\R^5$. We have described it explicitly in Example 5.1 as
\mathdisp {{ \left\{ u { \left(- { \frac{ 1 }{ 3 } }, 0 , { \frac{ 1 }{ 3 } } ,1,0\right) } + v { \left(- { \frac{ 2 }{ 13 } }, { \frac{ 5 }{ 13 } }, -{ \frac{ 4 }{ 13 } },0,1\right) } \mid u,v \in \R \right\} }} { . }
This description also shows that the solution set is a vector space. Moreover, with this description, it is clear that $L$ is in bijection with $\R^2$, and this bijection respects the addition and also the scalar multiplication \extrabracket {the solution set $L'$ of the inhomogeneous system is also in bijection with $\R^2$, but there is no reasonable addition or scalar multiplication on $L'$} {} {.} However, this bijection depends heavily on the chosen \quotationshort{basic solutions}{} \mathcor {} {{ \left(- { \frac{ 1 }{ 3 } }, 0 , { \frac{ 1 }{ 3 } } ,1,0\right) }} {and} {{ \left(- { \frac{ 2 }{ 13 } }, { \frac{ 5 }{ 13 } }, -{ \frac{ 4 }{ 13 } },0,1\right) }} {,} which depend on the order of elimination. There are several equally good basic solutions for $L$.

}

This example also shows the following: the solution space of a homogeneous linear system over $K$ is \quotationshort{in a natural way}{,} that is, independently of any choice, a linear subspace of $K^n$ \extrabracket {where $n$ is the number of variables} {} {.} For this solution space, there always exists a \quotationshort{linear bijection}{} (an \quotationshort{isomorphism}{}) to some \mathl{K^{d}}{} \extrabracket {\mathrelationchainb
{\relationchainb
{d }
{ \leq }{n }
{ }{ }
{ }{ }
{ }{ }
} {}{}{}} {} {,} but there is no natural choice for such a bijection. This is one of the main reasons to work with abstract vector spaces, instead of just $K^n$.






\subtitle {Generating systems}

The solution set of a homogeneous linear system in $n$ variables over a field $K$ is a linear subspace of $K^n$. This solution space is often described as the set of all \quotationshort{linear combinations}{} of finitely many \extrabracket {simple} {} {} solutions. In this and the next lecture, we will develop the concepts to make this precise.






\image{ \begin{center}
\includegraphics[width=5.5cm]{\imageinclude {VectorGenerado.gif} }
\end{center}
\imagetext {The plane generated by two vectors $v_1$ and $v_2$ consists of all linear combinations
\mathrelationchain
{\relationchain
{ u }
{ = }{ s v_1+ t v_2 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.}} }

\imagelicense { VectorGenerado.gif } {} {Marianov} {Commons} {PD} {}




\inputdefinition
{ }
{

Let $K$ be a field, and let $V$ be a $K$-vector space. Let \mathl{v_1 , \ldots , v_n}{} denote a family of vectors in $V$. Then the vector
\mathdisp {s_1v_1+s_2v_2 + \cdots + s_nv_n \text{ with } s_i \in K} { }
is called a \definitionword {linear combination}{} of these vectors

\extrabracket {for the \keyword {coefficient tuple} {} $(s_1 , \ldots , s_n)$} {} {.}

}

Two different coefficient tuples can define the same vector.
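
For instance, for the vectors
\mathdisp {v_1 = (1,0) ,\, v_2 = (0,1) ,\, v_3 = (1,1)} { }
in $K^2$, the coefficient tuples $(1,1,0)$ and $(0,0,1)$ define the same vector, namely
\mathdisp {1 \cdot v_1 + 1 \cdot v_2 + 0 \cdot v_3 = (1,1) = 0 \cdot v_1 + 0 \cdot v_2 + 1 \cdot v_3} { . }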




\inputdefinition
{ }
{

Let $K$ be a field, and let $V$ be a $K$-vector space. A family
\mathcond {v_i \in V} {}
{i \in I} {}
{} {} {} {,} is called a \definitionword {generating system}{} \extrabracket {or \definitionword {spanning system}{}} {} {} of $V$, if every vector
\mathrelationchain
{\relationchain
{v }
{ \in }{V }
{ }{ }
{ }{ }
{ }{ }
} {}{}{} can be written as
\mathrelationchaindisplay
{\relationchain
{v }
{ =} {\sum_{j \in J} s_j v_j }
{ } { }
{ } { }
{ } { }
} {}{}{,} with a finite subfamily
\mathrelationchain
{\relationchain
{J }
{ \subseteq }{I }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} and with
\mathrelationchain
{\relationchain
{ s_j }
{ \in }{ K }
{ }{ }
{ }{ }
{ }{ }
}

{}{}{.}

}

In $K^n$, the standard vectors
\mathcond {e_i} {}
{1 \leq i \leq n} {}
{} {} {} {,} form a generating system. In the polynomial ring \mathl{K[X]}{,} the powers
\mathcond {X^n} {}
{n \in \N} {}
{} {} {} {,} form an \extrabracket {infinite} {} {} generating system.
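
The first claim rests on the identity
\mathdisp {(a_1 , \ldots , a_n) = a_1 e_1 + a_2 e_2 + \cdots + a_n e_n} { , }
which expresses an arbitrary vector of $K^n$ as a linear combination of the standard vectors; similarly, every polynomial is, by its very form, a linear combination of finitely many powers $X^n$.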




\inputdefinition
{ }
{

Let $K$ be a field, and let $V$ be a $K$-vector space. For a family
\mathcond {v_i} {}
{i \in I} {}
{} {} {} {,} we set
\mathrelationchaindisplay
{\relationchain
{ \langle v_i ,\, i \in I \rangle }
{ =} { { \left\{ \sum_{i \in J} s_i v_i \mid s_i \in K , \, J \subseteq I \text{ finite subset} \right\} } }
{ } { }
{ } { }
{ } { }
} {}{}{,}

and call this the \definitionword {linear span}{} of the family, or the \definitionword {generated linear subspace}{.}

}

The empty set generates the null space\extrafootnote {This follows from the definition, if we use the convention that the empty sum equals $0$} {.} {.} The null space is also generated by the element $0$. A single vector $v$ spans the space
\mathrelationchain
{\relationchain
{ Kv }
{ = }{ { \left\{ s v \mid s \in K \right\} } }
{ }{ }
{ }{ }
{ }{ }
} {}{}{.} For
\mathrelationchain
{\relationchain
{v }
{ \neq }{ 0 }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} this is a \keyword {line} {,} a term we will make more precise in the framework of dimension theory. For two vectors \mathcor {} {v} {and} {w} {,} the \quotationshort{form}{} of the spanned space depends on how the two vectors are related to each other. If they both lie on a line, say
\mathrelationchain
{\relationchain
{w }
{ = }{ s v }
{ }{ }
{ }{ }
{ }{ }
} {}{}{,} then $w$ is superfluous, and the linear subspace generated by the two vectors equals the linear subspace generated by $v$. If this is not the case \extrabracket {and \mathcor {} {v} {and} {w} {} are not $0$} {} {,} then the two vectors span a \quotationshort{plane}{.}
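
For instance, the two vectors $v = (1,0,0)$ and $w = (0,1,0)$ in $K^3$ span the plane
\mathdisp {\langle v , w \rangle = { \left\{ s (1,0,0) + t (0,1,0) \mid s,t \in K \right\} } = { \left\{ (s,t,0) \mid s,t \in K \right\} }} { . }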

We list some simple properties for generating systems and linear subspaces.




\inputfactproof
{Vector space/Generating system and spanning subspace/Fact}
{Lemma}
{}
{

\factsituation {Let $K$ be a field, and let $V$ be a $K$-vector space.}
\factsegue {Then the following hold.}
\factconclusion {\enumerationthree {Let
\mathcond {U_j} {}
{j \in J} {}
{} {} {} {,} be a family of linear subspaces of $V$. Then the intersection
\mathrelationchaindisplay
{\relationchain
{ U }
{ =} { \bigcap_{j \in J} U_j }
{ } { }
{ } { }
{ } { }
} {}{}{} is a linear subspace. } {Let
\mathcond {v_i} {}
{i \in I} {}
{} {} {} {,} be a family of elements of $V$, and consider the subset $W$ of $V$ which is given by all linear combinations of these elements. Then $W$ is a linear subspace of $V$. } {The family
\mathcond {v_i} {}
{i \in I} {}
{} {} {} {,} is a system of generators of $V$ if and only if
\mathrelationchaindisplay
{\relationchain
{ \langle v_i ,\, i\in I \rangle }
{ =} { V }
{ } { }
{ } { }
{ } { }
} {}{}{.} }}
\factextra {}

}
{See Exercise 6.26 .}
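
For instance, for the first part: the null vector lies in every $U_j$, hence in the intersection $U$; if $u,v \in U$, then $u,v \in U_j$ for every $j \in J$, hence $u+v \in U_j$ for every $j \in J$, and therefore $u+v \in U$; the argument for the scalar multiplication is analogous.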