Mathematics for Applied Sciences (Osnabrück 2023-2024)/Part I/Lecture 27/latex
\setcounter{section}{27}
\subtitle {Eigenvalues and eigenvectors}
For a reflection at an axis in the plane, certain vectors behave particularly simply. The vectors on the axis are sent to themselves, and the vectors orthogonal to the axis are sent to their negatives. In both cases, the image of such a vector under the linear mapping lies on the line spanned by the vector itself. In the theory of eigenvalues and eigenvectors, we want to know whether, for a given linear mapping, there exist lines \extrabracket {one-dimensional linear subspaces} {} {} that are mapped to themselves. The goal is to find, for the linear mapping, a basis such that the describing matrix is as simple as possible. An important application is finding the solutions of a system of linear differential equations.
\image{ \begin{center}
\includegraphics[width=5.5cm]{\imageinclude {Simetria_axial.png} }
\end{center}
\imagetext {A \keyword {reflection at an axis} {} has two eigenlines, the axis with eigenvalue $1$ and the orthogonal line with eigenvalue $-1$.} }
\imagelicense { Simetria axial.png } {} {Rovnet} {Commons} {CC-by-sa 3.0} {}
\inputdefinition
{ }
{
Let $K$ be a
field,
$V$ a
$K$-vector space
and
\mathdisp {\varphi \colon V \longrightarrow V} { }
a
linear mapping.
Then an element
\mathcond {v \in V} {}
{v \neq 0} {}
{} {} {} {,}
is called an \definitionword {eigenvector}{} of $\varphi$
\extrabracket {for the
eigenvalue
$\lambda$} {} {,}
if
\mathrelationchaindisplay
{\relationchain
{ \varphi(v)
}
{ =} { \lambda v
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
for some
\mathrelationchain
{\relationchain
{ \lambda
}
{ \in }{K
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
}
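In concrete cases, the defining equation can be checked directly. The following small sketch, assuming Python with the numpy library \extrabracket {an illustration, not part of the formal development} {} {,} verifies the eigenvector equation for the reflection at the $x$-axis from the picture above.
\begin{verbatim}
# Hedged illustration (assumes Python with numpy; not part of the lecture):
# v is an eigenvector of the matrix M for the eigenvalue lam precisely
# when M v = lam * v and v is not the zero vector.
import numpy as np

M = np.array([[1.0, 0.0],
              [0.0, -1.0]])            # reflection at the x-axis

v = np.array([3.0, 0.0])               # nonzero vector on the axis
print(np.allclose(M @ v, 1.0 * v))     # True: eigenvector for lambda = 1

w = np.array([0.0, 2.0])               # nonzero vector orthogonal to the axis
print(np.allclose(M @ w, -1.0 * w))    # True: eigenvector for lambda = -1
\end{verbatim}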
\image{ \begin{center}
\includegraphics[width=5.5cm]{\imageinclude {VerticalShear_m_1_25.svg} }
\end{center}
\imagetext {A \keyword {shear mapping} {} has one eigenline for the eigenvalue $1$, and no further eigenvalues.} }
\imagelicense { VerticalShear_m=1.25.svg } {} {RobHar} {Commons} {PD} {}
\inputdefinition
{ }
{
Let $K$ be a
field,
$V$ a
$K$-vector space
and
\mathdisp {\varphi \colon V \longrightarrow V} { }
a
linear mapping.
Then an element
\mathrelationchain
{\relationchain
{\lambda
}
{ \in }{K
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
is called an \definitionword {eigenvalue}{} for $\varphi$, if there exists a vector
\mathrelationchain
{\relationchain
{v
}
{ \in }{V
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
\mathrelationchain
{\relationchain
{ v
}
{ \neq }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
such that
\mathrelationchaindisplay
{\relationchain
{ \varphi(v)
}
{ =} {\lambda v
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
}
\inputdefinition
{ }
{
Let $K$ be a
field,
$V$ a
$K$-vector space
and
\mathdisp {\varphi \colon V \longrightarrow V} { }
a
linear mapping.
For
\mathrelationchain
{\relationchain
{ \lambda
}
{ \in }{K
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
we denote by
\mathrelationchaindisplay
{\relationchain
{ \operatorname{Eig}_{ \lambda } { \left( \varphi \right) }
}
{ \defeq} { { \left\{ v \in V \mid \varphi(v) = \lambda v \right\} }
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
the \definitionword {eigenspace}{} of $\varphi$ for the value $\lambda$.
}
Thus we allow arbitrary values \extrabracket {not only eigenvalues} {} {} in the definition of an eigenspace. We will see in Fact ***** that eigenspaces are linear subspaces. In particular, $0$ belongs to every eigenspace, though it is never an eigenvector. The linear subspace generated by an eigenvector is called an \keyword {eigenline} {.} For most \extrabracket {in fact, all but finitely many, if the vector space has finite dimension} {} {} $\lambda$, the eigenspace is just the zero space.
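This last claim can be tested concretely: the eigenspace for $\lambda$ is nonzero exactly when the homogeneous linear system given by $\lambda$ and the describing matrix has a nontrivial solution. A minimal sketch, assuming Python with numpy \extrabracket {the matrix and the test values are arbitrary choices} {} {.}
\begin{verbatim}
# Hedged sketch (assumes numpy; the matrix and test values are arbitrary):
# Eig_lam(M) is nonzero exactly when (lam*I - M) x = 0 has a nontrivial
# solution, i.e. exactly when lam*I - M does not have full rank.
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))        # a "generic" 3x3 matrix

for lam in [0.5, 1.0, 2.0]:
    A = lam * np.eye(3) - M
    only_zero = np.linalg.matrix_rank(A) == 3   # full rank: only x = 0
    print(lam, "eigenspace is the zero space:", only_zero)
\end{verbatim}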
We consider some easy examples over $\R$.
\inputexample{}
{
A linear mapping from $\R$ to $\R$ is the multiplication with a fixed number
\mathrelationchain
{\relationchain
{a
}
{ \in }{\R
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
\extrabracket {the \keyword {proportionality factor} {}} {} {.}
Therefore, every number
\mathrelationchain
{\relationchain
{v
}
{ \neq }{0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
is an
eigenvector
for the
eigenvalue
$a$, and the
eigenspace
for this eigenvalue is all of $\R$. Besides $a$, there are no other eigenvalues, and all eigenspaces for
\mathrelationchain
{\relationchain
{ \lambda
}
{ \neq }{a
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
are $0$.
}
\inputexample{}
{
A
linear mapping
from $\R^2$ to $\R^2$ is described by a
$2\times 2$-matrix
with respect to the
standard basis.
We consider the eigenvalues for some elementary examples. A
homothety
is given as \mathl{v \mapsto av}{,} with a scaling factor
\mathrelationchain
{\relationchain
{a
}
{ \in }{ \R
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
Every vector
\mathrelationchain
{\relationchain
{v
}
{ \neq }{0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
is an
eigenvector
for the
eigenvalue
$a$, and the eigenspace for this eigenvalue is all of $\R^2$. Besides $a$, there are no other eigenvalues, and all eigenspaces for
\mathrelationchain
{\relationchain
{ \lambda
}
{ \neq }{a
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
are $0$. The identity has only the eigenvalue $1$.
The
reflection
at the $x$-axis is described by the matrix \mathl{\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}}{.} The eigenspace for the eigenvalue $1$ is the $x$-axis, the eigenspace for the eigenvalue $-1$ is the $y$-axis. A vector \mathl{(s,t)}{} with
\mathrelationchain
{\relationchain
{s,t
}
{ \neq }{0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
is not an eigenvector, since the equation
\mathrelationchaindisplay
{\relationchain
{ (s,-t)
}
{ =} { \lambda (s,t)
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
does not have a solution.
A
plane rotation
is described by a rotation matrix \mathl{\begin{pmatrix}
\operatorname{cos} \, \alpha & - \operatorname{sin} \, \alpha \\
\operatorname{sin} \, \alpha & \operatorname{cos} \,\alpha
\end{pmatrix}}{} for the rotation angle
\mathcond {\alpha} {}
{0 \leq \alpha < 2 \pi} {}
{} {} {} {.}
For
\mathrelationchain
{\relationchain
{\alpha
}
{ = }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
this is the identity, for
\mathrelationchain
{\relationchain
{\alpha
}
{ = }{ \pi
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
this is a half rotation, which is the reflection at the origin, that is, the homothety with factor $-1$. For all other rotation angles, there is no line sent to itself, so these rotations have no eigenvalues and no eigenvectors
\extrabracket {and all eigenspaces are $0$} {} {.}
}
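The rotation case can also be checked numerically: for a generic angle, the eigenvalues of the rotation matrix are not real. A sketch, assuming Python with numpy \extrabracket {an illustration, not part of the lecture} {} {.}
\begin{verbatim}
# Hedged illustration (assumes numpy): the eigenvalues of a plane rotation
# matrix are complex for a generic angle, matching the fact that no line
# is mapped to itself; only alpha = 0 and alpha = pi give real eigenvalues.
import numpy as np

def rotation(alpha):
    return np.array([[np.cos(alpha), -np.sin(alpha)],
                     [np.sin(alpha),  np.cos(alpha)]])

for alpha in [0.0, np.pi / 3, np.pi]:
    eigvals = np.linalg.eigvals(rotation(alpha))
    all_real = np.all(np.isclose(eigvals.imag, 0.0))
    print(alpha, eigvals, all_real)
\end{verbatim}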
\inputfactproof
{Endomorphism/Eigenspaces are linear subspaces/Zero/Fact}
{Lemma}
{}
{
\factsituation {Let $K$ be a
field,
$V$ a
$K$-vector space
and
\mathdisp {\varphi \colon V \longrightarrow V} { }
a
linear mapping.}
\factsegue {Then the following statements hold.}
\factconclusion {\enumerationthree {Every
eigenspace
\mathdisp {\operatorname{Eig}_{ \lambda } { \left( \varphi \right) }} { }
is a
linear subspace
of $V$.
} {$\lambda$ is an
eigenvalue
for $\varphi$, if and only if the eigenspace \mathl{\operatorname{Eig}_{ \lambda } { \left( \varphi \right) }}{} is not the
null space.
} {A vector \mathl{v \in V, \, v \neq 0}{,} is an
eigenvector
for $\lambda$, if and only if
\mathrelationchain
{\relationchain
{v
}
{ \in }{ \operatorname{Eig}_{ \lambda } { \left( \varphi \right) }
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
}}
\factextra {}
{See Exercise 27.14 .}
For matrices, we use the same concepts. If
$\varphi \colon V \rightarrow V$
is a linear mapping, and $M$ is a describing matrix with respect to a basis, then for an eigenvalue $\lambda$ and an eigenvector
\mathrelationchain
{\relationchain
{v
}
{ \in }{ V
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
with corresponding coordinate tuple \mathl{\begin{pmatrix} x_{1 } \\ \vdots\\ x_{ n } \end{pmatrix}}{} with respect to the basis, we have the relation
\mathrelationchaindisplay
{\relationchain
{ M \begin{pmatrix} x_{1 } \\ \vdots\\ x_{ n } \end{pmatrix}
}
{ =} { \lambda \begin{pmatrix} x_{1 } \\ \vdots\\ x_{ n } \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
The describing matrix $N$ with respect to another basis satisfies, due
to Lemma 25.8
,
the relation
\mathrelationchain
{\relationchain
{ N
}
{ = }{ BMB^{-1}
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
where $B$ is an invertible matrix. Let
\mathrelationchaindisplay
{\relationchain
{ \begin{pmatrix} x'_{1 } \\ \vdots\\ x'_{ n } \end{pmatrix}
}
{ =} { B \begin{pmatrix} x_{1 } \\ \vdots\\ x_{ n } \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
denote the coordinate tuple with respect to the second basis. Then
\mathrelationchainalign
{\relationchainalign
{ N\begin{pmatrix} x'_{1 } \\ \vdots\\ x'_{ n } \end{pmatrix}
}
{ =} { (BMB^{-1}) \begin{pmatrix} x'_{1 } \\ \vdots\\ x'_{ n } \end{pmatrix}
}
{ =} { (BM B^{-1}) B \begin{pmatrix} x_{1 } \\ \vdots\\ x_{ n } \end{pmatrix}
}
{ =} { BM \begin{pmatrix} x_{1 } \\ \vdots\\ x_{ n } \end{pmatrix}
}
{ =} { B \lambda \begin{pmatrix} x_{1 } \\ \vdots\\ x_{ n } \end{pmatrix}
}
}
{
\relationchainextensionalign
{ =} { \lambda B \begin{pmatrix} x_{1 } \\ \vdots\\ x_{ n } \end{pmatrix}
}
{ =} { \lambda \begin{pmatrix} x'_{1 } \\ \vdots\\ x'_{ n } \end{pmatrix}
}
{ } {}
{ } {}
}
{}{,}
i.e., the describing matrices have the same eigenvalues, but the coordinate tuples for the eigenvectors are different.
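This base-change computation can be replayed numerically. A minimal sketch, assuming Python with numpy \extrabracket {the matrices $M$ and $B$ are arbitrary choices} {} {:} $N = BMB^{-1}$ has the same eigenvalues as $M$, and if $x$ is the coordinate tuple of an eigenvector with respect to the first basis, then $Bx$ is the coordinate tuple with respect to the second.
\begin{verbatim}
# Hedged sketch (assumes numpy; M and B are arbitrary choices):
# N = B M B^{-1} has the same eigenvalues as M, and the eigenvector
# coordinates transform by B.
import numpy as np

M = np.array([[0.0, 5.0], [1.0, 0.0]])
B = np.array([[1.0, 2.0], [0.0, 1.0]])          # any invertible matrix
N = B @ M @ np.linalg.inv(B)

print(np.sort(np.linalg.eigvals(M)))            # same spectrum ...
print(np.sort(np.linalg.eigvals(N)))            # ... for both matrices

lam = np.sqrt(5.0)
x = np.array([np.sqrt(5.0), 1.0])               # eigenvector tuple for M
print(np.allclose(N @ (B @ x), lam * (B @ x)))  # B x works for N
\end{verbatim}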
\inputexample{}
{
We consider the
linear mapping
\mathdisp {\varphi \colon K^n \longrightarrow K^n
, e_i \longmapsto d_ie_i} { , }
given by the
diagonal matrix
\mathdisp {\begin{pmatrix} d_1 & 0 & \cdots & \cdots & 0 \\ 0 & d_2 & 0 & \cdots & 0 \\ \vdots & \ddots & \ddots & \ddots & \vdots \\ 0 & \cdots & 0 & d_{ n-1} & 0 \\ 0 & \cdots & \cdots & 0 & d_{ n } \end{pmatrix}} { . }
The diagonal entries $d_i$ are the
eigenvalues
of $\varphi$, and the $i$-th standard vector $e_i$ is a corresponding
eigenvector.
The eigenspaces are
\mathrelationchainalign
{\relationchainalign
{ \operatorname{Eig}_{ d } { \left( \varphi \right) }
}
{ =} { { \left\{ v \in K^n \mid v \text{ is a linear combination of those } e_i, \text{ for which } d = d_i \text{ holds} \right\} }
}
{ } {
}
{ } {
}
{ } {
}
}
{}
{}{.}
These spaces are different from $0$ if and only if $d$ equals one of the diagonal entries. The dimension of the eigenspace \mathl{\operatorname{Eig}_{ d } { \left( \varphi \right) }}{} is the number of times the value $d$ occurs on the diagonal. The sum of all these dimensions equals $n$.
}
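A quick numerical cross-check of the diagonal example, assuming Python with numpy \extrabracket {the diagonal entries are an arbitrary choice} {} {:} the dimension of each eigenspace, computed as the dimension of the solution space of the corresponding homogeneous system, equals the multiplicity of $d$ on the diagonal.
\begin{verbatim}
# Hedged check (assumes numpy; the diagonal entries are an arbitrary choice):
# the eigenvalues of a diagonal matrix are its diagonal entries, and the
# dimension of Eig_d equals the multiplicity of d on the diagonal.
import numpy as np

d = [2.0, 3.0, 2.0, 7.0]
D = np.diag(d)

print(np.linalg.eigvals(D))                  # [2. 3. 2. 7.]

for value in set(d):
    A = value * np.eye(4) - D
    dim = 4 - np.linalg.matrix_rank(A)       # dimension of the solution space
    print(value, dim, d.count(value))        # dimension = multiplicity
\end{verbatim}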
\inputexample{}
{
For an \keyword {orthogonal reflection} {} of $\R^n$, there exists an \mathl{(n-1)}{-}dimensional linear subspace
\mathrelationchain
{\relationchain
{U
}
{ \subseteq }{ \R^n
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
which is fixed by the mapping, and every vector orthogonal to $U$ is sent to its negative. If \mathl{v_1 , \ldots , v_{n-1}}{} is a basis of $U$ and $v_n$ is a vector orthogonal to $U$, then the reflection is described by the matrix
\mathdisp {\begin{pmatrix} 1 & 0 & \cdots & \cdots & 0 \\ 0 & 1 & 0 & \cdots & 0 \\ \vdots & \ddots & \ddots & \ddots & \vdots \\ 0 & \cdots & 0 & 1 & 0 \\ 0 & \cdots & \cdots & 0 & -1 \end{pmatrix}} { }
with respect to this basis.
}
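One standard way to write down such a reflection in coordinates is the Householder matrix $H = I_n - 2 u u^{T} / \langle u , u \rangle$, where $u \neq 0$ is orthogonal to $U$. This construction is not introduced in the lecture, so take the following numpy sketch as a hedged illustration.
\begin{verbatim}
# Hedged sketch (assumes numpy; u is an arbitrary nonzero choice): the
# Householder matrix H = I - 2 u u^T / <u, u> describes the orthogonal
# reflection fixing the hyperplane orthogonal to u; its eigenvalues are
# 1 (multiplicity n-1) and -1 (multiplicity 1).
import numpy as np

n = 4
u = np.array([1.0, 2.0, 0.0, -1.0])       # nonzero vector orthogonal to U
H = np.eye(n) - 2.0 * np.outer(u, u) / (u @ u)

print(np.sort(np.linalg.eigvals(H)))      # [-1.  1.  1.  1.]
print(np.allclose(H @ u, -u))             # True: u is sent to its negative
\end{verbatim}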
\inputexample{}
{
We consider the
linear mapping
\mathdisp {\varphi \colon \Q^2 \longrightarrow \Q^2
, \begin{pmatrix} x \\y \end{pmatrix} \longmapsto \begin{pmatrix} 0 & 5 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} x \\y \end{pmatrix} = \begin{pmatrix} 5y \\x \end{pmatrix}} { , }
given by the matrix
\mathrelationchaindisplay
{\relationchain
{M
}
{ =} { \begin{pmatrix} 0 & 5 \\ 1 & 0 \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
The question of whether this mapping has
eigenvalues
leads to the question of whether there exists some
\mathrelationchain
{\relationchain
{ \lambda
}
{ \in }{\Q
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
such that the equation
\mathrelationchaindisplay
{\relationchain
{ \begin{pmatrix} 0 & 5 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} x \\y \end{pmatrix}
}
{ =} { \lambda \begin{pmatrix} x \\y \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
has a nontrivial solution
\mathrelationchain
{\relationchain
{ (x,y)
}
{ \neq }{ (0,0)
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
For a given $\lambda$, this is a linear problem and can be solved with the elimination algorithm. However, the question of whether eigenvalues exist at all leads, due to the variable \quotationshort{eigenvalue parameter}{} $\lambda$, to a nonlinear problem. The system of equations above is
\mathdisp {5y = \lambda x \text{ and } x = \lambda y} { . }
For
\mathrelationchain
{\relationchain
{y
}
{ = }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
we get
\mathrelationchain
{\relationchain
{x
}
{ = }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
but the null vector is not an eigenvector. Hence, suppose that
\mathrelationchain
{\relationchain
{y
}
{ \neq }{0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
Both equations combined yield the condition
\mathrelationchaindisplay
{\relationchain
{5y
}
{ =} { \lambda x
}
{ =} { \lambda^2 y
}
{ } {
}
{ } {
}
}
{}{}{,}
hence
\mathrelationchain
{\relationchain
{5
}
{ = }{ \lambda^2
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
But in $\Q$, the number $5$ does not have a
square root,
and therefore there is no solution. This means that $\varphi$ has no eigenvalues and no
eigenvectors.
Now we consider the matrix $M$ as a real matrix, and look at the corresponding mapping
\mathdisp {\psi \colon \R^2 \longrightarrow \R^2
, \begin{pmatrix} x \\y \end{pmatrix} \longmapsto \begin{pmatrix} 0 & 5 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} x \\y \end{pmatrix} = \begin{pmatrix} 5y \\x \end{pmatrix}} { . }
The same computations as above lead to the condition
\mathrelationchain
{\relationchain
{ 5
}
{ = }{ \lambda^2
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
and within the real numbers, we have the two solutions
\mathdisp {\lambda_1 = \sqrt{5} \text{ and } \lambda_2 = - \sqrt{5}} { . }
For both values, we now have to find the eigenvectors. First, we consider the case
\mathrelationchain
{\relationchain
{ \lambda
}
{ = }{\sqrt{5}
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
which yields the linear system
\mathrelationchaindisplay
{\relationchain
{ \begin{pmatrix} 0 & 5 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} x \\y \end{pmatrix}
}
{ =} { \sqrt{5} \begin{pmatrix} x \\y \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
We write this as
\mathrelationchaindisplay
{\relationchain
{ \begin{pmatrix} 0 & 5 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} x \\y \end{pmatrix}
}
{ =} { \begin{pmatrix} \sqrt{5} & 0 \\ 0 & \sqrt{5} \end{pmatrix} \begin{pmatrix} x \\y \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
and as
\mathrelationchaindisplay
{\relationchain
{ \begin{pmatrix} \sqrt{5} & -5 \\ -1 & \sqrt{5} \end{pmatrix} \begin{pmatrix} x \\y \end{pmatrix}
}
{ =} { \begin{pmatrix} 0 \\0 \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
This system can be solved easily; the solution space has dimension one, and
\mathrelationchaindisplay
{\relationchain
{v
}
{ =} { \begin{pmatrix} \sqrt{5} \\1 \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
is a basic solution.
For
\mathrelationchain
{\relationchain
{ \lambda
}
{ = }{ - \sqrt{5 }
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
we do the same steps, and the vector
\mathrelationchaindisplay
{\relationchain
{w
}
{ =} { \begin{pmatrix} -\sqrt{5} \\1 \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
is a basic solution. Thus over $\R$, the numbers
\mathcor {} {\sqrt{5}} {and} {- \sqrt{5}} {}
are eigenvalues, and the corresponding
eigenspaces
are
\mathdisp {\operatorname{Eig}_{ \sqrt{5} } { \left( \psi \right) } = { \left\{ s \begin{pmatrix} \sqrt{5} \\1 \end{pmatrix} \mid s \in \R \right\} } \text{ and } \operatorname{Eig}_{ -\sqrt{5} } { \left( \psi \right) } = { \left\{ s \begin{pmatrix} - \sqrt{5} \\1 \end{pmatrix} \mid s \in \R \right\} }} { . }
}
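As a cross-check of this example, assuming Python with numpy \extrabracket {an illustration, not part of the lecture} {} {,} a numerical eigenvalue routine recovers $\sqrt{5}$ and $-\sqrt{5}$, and the eigenvectors computed above satisfy the defining equations.
\begin{verbatim}
# Hedged cross-check of the example (assumes numpy): the eigenvalues of M
# are sqrt(5) and -sqrt(5), with the eigenvectors computed above.
import numpy as np

M = np.array([[0.0, 5.0], [1.0, 0.0]])
print(np.linalg.eigvals(M))                   # approx [ 2.236, -2.236]

v = np.array([np.sqrt(5.0), 1.0])
w = np.array([-np.sqrt(5.0), 1.0])
print(np.allclose(M @ v, np.sqrt(5.0) * v))   # True
print(np.allclose(M @ w, -np.sqrt(5.0) * w))  # True
\end{verbatim}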
\subtitle {Eigenspaces}
\inputfactproof
{Linear mapping/Eigenvalue zero/Characterization/Fact}
{Lemma}
{}
{
\factsituation {Let $K$ be a
field,
$V$ a
$K$-vector space
and
\mathdisp {\varphi \colon V \longrightarrow V} { }
a
linear mapping.}
\factconclusion {Then
\mathrelationchaindisplay
{\relationchain
{ \operatorname{ker} { \left( \varphi\right) }
}
{ =} { \operatorname{Eig}_{ 0 } { \left( \varphi \right) }
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}}
\factextra {In particular, $0$ is an
eigenvalue
of $\varphi$ if and only if $\varphi$ is not
injective.}
{See Exercise 27.17 .}
More generally, we have the following characterization.
\inputfactproof
{Linear mapping/Eigenspace as kernel/Fact}
{Lemma}
{}
{
\factsituation {Let $K$ be a
field,
$V$ a
$K$-vector space
and
\mathdisp {\varphi \colon V \longrightarrow V} { }
a
linear mapping.
Let
\mathrelationchain
{\relationchain
{ \lambda
}
{ \in }{K
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}}
\factconclusion {Then
\mathrelationchaindisplay
{\relationchain
{ \operatorname{Eig}_{ \lambda } { \left( \varphi \right) }
}
{ =} { \operatorname{ker} { \left( \lambda \cdot
\operatorname{Id}_{ V } - \varphi \right) }
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}}
\factextra {}
}
{
Let
\mathrelationchain
{\relationchain
{v
}
{ \in }{V
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
Then
\mathrelationchain
{\relationchain
{v
}
{ \in }{ \operatorname{Eig}_{ \lambda } { \left( \varphi \right) }
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
if and only if
\mathrelationchain
{\relationchain
{ \varphi(v)
}
{ = }{ \lambda v
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
and this is the case if and only if
\mathrelationchain
{\relationchain
{ \lambda v - \varphi(v)
}
{ = }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
holds, which means
\mathrelationchain
{\relationchain
{ { \left( \lambda \cdot
\operatorname{Id}_{ V } - \varphi \right) } (v)
}
{ = }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
}
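Computationally, the lemma says that a basis of the eigenspace can be read off from the kernel of $\lambda \cdot \operatorname{Id} - M$. A minimal sketch, assuming Python with numpy, where the kernel is computed via the singular value decomposition \extrabracket {the helper name kernel_basis is ours} {} {.}
\begin{verbatim}
# Hedged sketch (assumes numpy; the helper name kernel_basis is ours):
# by the lemma, Eig_lam(M) = ker(lam*Id - M), so a basis of the eigenspace
# can be read off from the null space, here computed via the SVD.
import numpy as np

def kernel_basis(A, tol=1e-10):
    """Orthonormal basis of ker(A), returned as the columns of an array."""
    _, s, vt = np.linalg.svd(A)   # for square A, len(s) == number of rows of vt
    return vt[s <= tol].T         # right singular vectors for ~zero singular values

M = np.array([[1.0, 0.0], [0.0, -1.0]])     # reflection at the x-axis
print(kernel_basis(1.0 * np.eye(2) - M))    # spans the x-axis: Eig_1
print(kernel_basis(-1.0 * np.eye(2) - M))   # spans the y-axis: Eig_{-1}
\end{verbatim}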
\inputremark {}
{
Besides the
eigenspace
for
\mathrelationchain
{\relationchain
{0
}
{ \in }{K
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
which is the
kernel
of the linear mapping, the eigenvalues
\mathcor {} {1} {and} {-1} {}
are of particular interest. The eigenspace for $1$ consists of all vectors that are sent to themselves. Restricted to this linear subspace, the mapping is just the identity; this subspace is called the \keyword {fixed space} {.} The eigenspace for $-1$ consists of all vectors that are sent to their negatives. On this linear subspace, the mapping acts like the reflection at the origin.
}
\inputfactproof
{Linear mapping/Eigenspace for different eigenvalues/Zero/Fact}
{Lemma}
{}
{
\factsituation {Let $K$ be a
field,
$V$ a
$K$-vector space
and
\mathdisp {\varphi \colon V \longrightarrow V} { }
a
linear mapping.
Let
\mathrelationchain
{\relationchain
{ \lambda_1
}
{ \neq }{ \lambda_2
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
be elements in $K$.}
\factconclusion {Then
\mathrelationchaindisplay
{\relationchain
{ \operatorname{Eig}_{ \lambda_1 } { \left( \varphi \right) } \cap \operatorname{Eig}_{ \lambda_2 } { \left( \varphi \right) }
}
{ =} { 0
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}}
\factextra {}
{See Exercise 27.19 .}
\inputfactproof
{Endomorphism/Eigenvectors/Linearly independent/Fact}
{Lemma}
{}
{
\factsituation {Let $K$ be a
field,
$V$ a
$K$-vector space
and
\mathdisp {\varphi \colon V \longrightarrow V} { }
a
linear mapping.
Let \mathl{v_1 , \ldots , v_n}{} be
eigenvectors
for
\extrabracket {pairwise} {} {}
different
eigenvalues
\mathrelationchain
{\relationchain
{ \lambda_1 , \ldots , \lambda_n
}
{ \in }{ K
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}}
\factconclusion {Then \mathl{v_1 , \ldots , v_n}{} are
linearly independent.}
\factextra {}
}
{
We prove the statement by induction on $n$. For
\mathrelationchain
{\relationchain
{n
}
{ = }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
the statement is true, since the empty family is linearly independent. Suppose now that the statement is true for fewer than $n$ vectors. We consider a representation of $0$, say
\mathrelationchaindisplay
{\relationchain
{ a_1v_1 + \cdots + a_nv_n
}
{ =} { 0
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
We apply $\varphi$ to this and get, on one hand,
\mathrelationchaindisplay
{\relationchain
{ a_1 \varphi(v_1) + \cdots + a_n \varphi(v_n)
}
{ =} { \lambda_1 a_1v_1 + \cdots + \lambda_n a_nv_n
}
{ =} { 0
}
{ } {
}
{ } {
}
}
{}{}{.}
On the other hand, we multiply the equation by $\lambda_{n}$ and get
\mathrelationchaindisplay
{\relationchain
{ \lambda_n a_1v_1 + \cdots + \lambda_n a_nv_n
}
{ =} { 0
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
We look at the difference of the two equations, and get
\mathrelationchaindisplay
{\relationchain
{ (\lambda_{n} - \lambda_1) a_1v_1 + \cdots + (\lambda_{n} - \lambda_{n-1}) a_{n-1} v_{n-1}
}
{ =} { 0
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
By the induction hypothesis, we get for the coefficients
\mathcond {(\lambda_n - \lambda_i)a_i=0} {}
{i = 1 , \ldots , n-1} {}
{} {} {} {.}
Because of
\mathrelationchain
{\relationchain
{ \lambda_n - \lambda_i
}
{ \neq }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
we get
\mathcond {a_i=0} {for}
{i = 1 , \ldots , n-1} {}
{} {} {} {,}
and because of
\mathrelationchain
{\relationchain
{v_n
}
{ \neq }{0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
we also get
\mathrelationchain
{\relationchain
{a_n
}
{ = }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
}
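A numerical illustration of this lemma, assuming Python with numpy \extrabracket {the matrices are arbitrary choices} {} {:} collecting eigenvectors for pairwise different eigenvalues as the columns of a matrix yields full rank, that is, linear independence.
\begin{verbatim}
# Hedged illustration (assumes numpy; D and B are arbitrary choices):
# eigenvectors for pairwise different eigenvalues are linearly independent,
# so a matrix with such eigenvectors as columns has full rank.
import numpy as np

D = np.diag([2.0, 3.0, 7.0])
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])             # invertible base change
M = B @ D @ np.linalg.inv(B)                # eigenvalues 2, 3, 7

eigvals, eigvecs = np.linalg.eig(M)         # eigenvectors are the columns
print(eigvals)
print(np.linalg.matrix_rank(eigvecs))       # 3: linearly independent
\end{verbatim}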
\inputfactproof
{Linear mapping/Finite dimensional/Eigenvalues bounded by dimension/Fact}
{Corollary}
{}
{
\factsituation {Let $K$ be a
field,
$V$ a
finite-dimensional
$K$-vector space
and
\mathdisp {\varphi \colon V \longrightarrow V} { }
a
linear mapping.}
\factconclusion {Then there exist at most \mathl{\dim_{ K } { \left( V \right) }}{} many
eigenvalues
for $\varphi$.}
\factextra {}
{See Exercise 27.20 .}