Mathematics for Applied Sciences (Osnabrück 2023-2024)/Part I/Lecture 28/latex
\setcounter{section}{28}
\subtitle {The characteristic polynomial}
We want to determine, for a given endomorphism $\varphi \colon V \rightarrow V$, the eigenvalues and the eigenspaces. For this, the characteristic polynomial is decisive.
\inputdefinition
{ }
{
For an
$n \times n$-matrix
$M$ with entries in a
field
$K$, the
polynomial
\mathrelationchaindisplay
{\relationchain
{ \chi_{ M }
}
{ \defeq} {\det { \left( X \cdot E_{ n } - M \right) }
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
is called the \definitionword {characteristic polynomial}{} of $M$.
}
For
\mathrelationchain
{\relationchain
{M
}
{ = }{ { \left( a_{ij} \right) }_{ij}
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
this means
\mathrelationchaindisplay
{\relationchain
{ \chi_{ M }
}
{ =} { \det \begin{pmatrix} X-a_{11} & -a_{12} & \ldots & -a_{1n} \\ -a_{21} & X-a_{22} & \ldots & -a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ -a_{n1} & -a_{n2} & \ldots & X-a_{nn} \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
In this definition, we use the determinant of a matrix, which we have only defined for matrices with entries in a field. The entries are now elements of the polynomial ring \mathl{K[X]}{.} But, since we can consider these elements also inside the
field of rational functions
\mathl{K(X)}{\extrafootnote {\mathlk{K(X)}{} is called the field of rational polynomials; it consists of all fractions \mathl{P/Q}{} for polynomials
\mathrelationchain
{\relationchain
{ P,Q
}
{ \in }{ K [X]
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
with
\mathrelationchain
{\relationchain
{ Q
}
{ \neq }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
For
\mathrelationchain
{\relationchain
{ K
}
{ = }{ \R
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
or $\C$, this field can be identified with the field of rational functions} {.} {,}}
this is a useful definition. By definition, the determinant is an element of \mathl{K(X)}{;} but, since all entries of the matrix are polynomials, and since the recursive definition of the determinant uses only addition and multiplication, the characteristic polynomial is indeed a polynomial. The degree of the characteristic polynomial is $n$, and its leading coefficient is $1$; hence it has the form
\mathrelationchaindisplay
{\relationchain
{ \chi_{ M }
}
{ =} { X^n + c_{n-1}X^{n-1} + \cdots + c_1 X+c_0
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
We have the important relation
\mathrelationchaindisplay
{\relationchain
{ \chi_{ M } (\lambda)
}
{ =} { \det { \left( \lambda E_{ n } - M \right) }
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
for every
\mathrelationchain
{\relationchain
{ \lambda
}
{ \in }{ K
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
see
Exercise 28.4
.
Here, on the left-hand side, the number $\lambda$ is inserted into the polynomial, and on the right-hand side, we have the determinant of a matrix which depends on $\lambda$.
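Both the defining determinant and the evaluation relation can be checked with a small computer-algebra sketch; here is one in Python using sympy (the library and the sample matrix are illustration choices, not part of the lecture).

```python
from sympy import Matrix, symbols, eye

X = symbols('X')
M = Matrix([[2, 1], [0, 3]])           # sample 2x2 matrix

# characteristic polynomial chi_M = det(X * E_n - M)
chi = (X * eye(2) - M).det().expand()  # a monic polynomial of degree 2

# inserting a scalar lam into chi_M equals det(lam * E_n - M)
lam = 7
assert chi.subs(X, lam) == (lam * eye(2) - M).det()
```

For this sample matrix, `chi` comes out as $X^2 - 5X + 6 = (X-2)(X-3)$, so the eigenvalues $2$ and $3$ can be read off as its zeroes.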
For a linear mapping
\mathdisp {\varphi \colon V \longrightarrow V} { }
on a finite-dimensional vector space, the \keyword {characteristic polynomial} {} is defined by
\mathrelationchaindisplay
{\relationchain
{ \chi_{ \varphi }
}
{ \defeq} { \chi_{ M }
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{,}
where $M$ is a describing matrix with respect to some basis. The
multiplication theorem for the determinant
shows that this definition is independent of the choice of the basis, see
Exercise 28.3
.
The characteristic polynomial of the identity on an $n$-dimensional vector space is
\mathrelationchaindisplay
{\relationchain
{ \chi_{
\operatorname{Id} }
}
{ =} { \det { \left( XE_n-E_n \right) }
}
{ =} { (X-1)^n
}
{ =} { X^n- nX^{n-1} + \binom{n}{2} X^{n-2} - \binom{n}{3} X^{n-3} + \cdots \pm \binom{n}{2} X^{2} \mp n X \pm 1
}
{ } {
}
}
{}{}{.}
\inputfactproof
{Endomorphism/Eigenvalue and characteristic polynomial/Fact}
{Theorem}
{}
{
\factsituation {Let $K$ denote a
field,
and let $V$ denote an
$n$-dimensional
vector space.
Let
\mathdisp {\varphi \colon V \longrightarrow V} { }
denote a
linear mapping.}
\factconclusion {Then
\mathrelationchain
{\relationchain
{ \lambda
}
{ \in }{ K
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
is an
eigenvalue
of $\varphi$ if and only if $\lambda$ is a zero of the
characteristic polynomial
$\chi_{ \varphi }$.}
\factextra {}
}
{
Let $M$ denote a
describing matrix
for $\varphi$, and let
\mathrelationchain
{\relationchain
{ \lambda
}
{ \in }{K
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
be given. We have
\mathrelationchaindisplay
{\relationchain
{ \chi_{ M }\, (\lambda)
}
{ =} { \det { \left( \lambda E_{ n } - M \right) }
}
{ =} { 0
}
{ } {
}
{ } {
}
}
{}{}{,}
if and only if the linear mapping
\mathdisp {\lambda
\operatorname{Id}_{ V } - \varphi} { }
is not
bijective
\extrabracket {and not
injective} {} {}
\extrabracket {due to
Theorem 26.11
and
Lemma 25.11
} {} {.}
This is, because of
Lemma 27.11
and
Lemma 24.14
,
equivalent to
\mathrelationchaindisplay
{\relationchain
{ \operatorname{Eig}_{ \lambda } { \left( \varphi \right) }
}
{ =} { \operatorname{ker} { \left( ( \lambda
\operatorname{Id}_{ V } - \varphi)\right) }
}
{ \neq} { 0
}
{ } {
}
{ } {
}
}
{}{}{,}
and this means that the
eigenspace
for $\lambda$ is not the null space; thus, $\lambda$ is an eigenvalue of $\varphi$.
\inputexample{}
{
We consider the real matrix
\mathrelationchain
{\relationchain
{M
}
{ = }{ \begin{pmatrix} 0 & 5 \\ 1 & 0 \end{pmatrix}
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
The
characteristic polynomial
is
\mathrelationchainalign
{\relationchainalign
{ \chi_{ M }
}
{ =} { \det { \left( x E_2 -M \right) }
}
{ =} { \det { \left( x \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} - \begin{pmatrix} 0 & 5 \\ 1 & 0 \end{pmatrix} \right) }
}
{ =} { \det \begin{pmatrix} x & -5 \\ -1 & x \end{pmatrix}
}
{ =} { x^2-5
}
}
{}
{}{.}
The eigenvalues are therefore
\mathrelationchain
{\relationchain
{x
}
{ = }{ \pm \sqrt{5}
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
\extrabracket {we have found these eigenvalues already in
Example 27.9
,
without using the characteristic polynomial} {} {.}
}
\inputexample{}
{
For the matrix
\mathrelationchaindisplay
{\relationchain
{M
}
{ =} { \begin{pmatrix} 2 & 5 \\ -3 & 4 \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{,}
the
characteristic polynomial
is
\mathrelationchaindisplay
{\relationchain
{ \chi_{ M }
}
{ =} { \det \begin{pmatrix} X-2 & -5 \\ 3 & X-4 \end{pmatrix}
}
{ =} { (X-2)(X-4) +15
}
{ =} { X^2 -6X +23
}
{ } {
}
}
{}{}{.}
Finding the zeroes of this polynomial leads to the condition
\mathrelationchaindisplay
{\relationchain
{ (X-3)^2
}
{ =} { -23 +9
}
{ =} { -14
}
{ } {
}
{ } {
}
}
{}{}{,}
which has no solution over $\R$, so that the matrix has no
eigenvalues
over $\R$. However, considered over the complex numbers $\Complex$, we have the two eigenvalues
\mathcor {} {3+\sqrt{14} { \mathrm i}} {and} {3 - \sqrt{14} { \mathrm i}} {.}
For the
eigenspace
for \mathl{3+\sqrt{14} { \mathrm i}}{,} we have to determine
\mathrelationchainalign
{\relationchainalign
{ \operatorname{Eig}_{ 3+\sqrt{14} { \mathrm i} } { \left( M \right) }
}
{ =} { \operatorname{ker} { \left( { \left( { \left( 3+ \sqrt{14} { \mathrm i} \right) } E_2 - M \right) } \right) }
}
{ =} { \operatorname{ker} { \left( \begin{pmatrix} 1 + \sqrt{14} { \mathrm i} & -5 \\ 3 & -1 + \sqrt{14} { \mathrm i} \end{pmatrix} \right) }
}
{ } {
}
{ } {
}
}
{}
{}{,}
a basis vector
\extrabracket {hence an eigenvector} {} {}
of this kernel is \mathl{\begin{pmatrix} 5 \\1+ \sqrt{14} { \mathrm i} \end{pmatrix}}{.} Analogously, we get
\mathrelationchaindisplay
{\relationchain
{ \operatorname{Eig}_{ 3 -\sqrt{14} { \mathrm i} } { \left( M \right) }
}
{ =} { \operatorname{ker} { \left( \begin{pmatrix} 1 - \sqrt{14} { \mathrm i} & -5 \\ 3 & -1 - \sqrt{14} { \mathrm i} \end{pmatrix} \right) }
}
{ =} { \langle \begin{pmatrix} 5 \\1 - \sqrt{14} { \mathrm i} \end{pmatrix} \rangle
}
{ } {
}
{ } {
}
}
{}{}{.}
}
\inputexample{}
{
For an
upper triangular matrix
\mathrelationchaindisplay
{\relationchain
{M
}
{ =} { \begin{pmatrix} d_1 & \ast & \cdots & \cdots & \ast \\ 0 & d_2 & \ast & \cdots & \ast \\ \vdots & \ddots & \ddots & \ddots & \vdots \\ 0 & \cdots & 0 & d_{ n-1} & \ast \\ 0 & \cdots & \cdots & 0 & d_{ n } \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{,}
the
characteristic polynomial
is
\mathrelationchaindisplay
{\relationchain
{ \chi_{ M }
}
{ =} { (X-d_1)(X-d_2) \cdots (X-d_n)
}
{ } {
}
{ } {
}
{ } {}
}
{}{}{,}
due to
Lemma 26.8
.
In this case, we have directly a factorization of the characteristic polynomial into linear factors, so that we can see immediately the zeroes and the
eigenvalues
of $M$, namely just the diagonal elements \mathl{d_1,d_2 , \ldots , d_n}{}
\extrabracket {which might not be all different} {} {.}
}
\subtitle {Multiplicities}
For a more detailed investigation of eigenspaces, the following concepts are necessary. Let
\mathdisp {\varphi \colon V \longrightarrow V} { }
denote a linear mapping on a finite-dimensional vector space $V$, and
\mathrelationchain
{\relationchain
{ \lambda
}
{ \in }{ K
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
Then the exponent of the linear polynomial \mathl{X - \lambda}{} inside the characteristic polynomial $\chi_{ \varphi }$ is called the \keyword {algebraic multiplicity} {} of $\lambda$, symbolized as
\mathrelationchain
{\relationchain
{ \mu_\lambda
}
{ \defeq }{ \mu_\lambda(\varphi)
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
The dimension of the corresponding eigenspace, that is
\mathdisp {\dim_{ K } { \left( \operatorname{Eig}_{ \lambda } { \left( \varphi \right) } \right) }} { , }
is called the \keyword {geometric multiplicity} {} of $\lambda$. Because of
Theorem 28.2
,
the algebraic multiplicity is positive if and only if the geometric multiplicity is positive. In general, these multiplicities can be different; however, we always have the following estimate.
\inputfactproof
{Endomorphism/Geometric and algebraic multiplicity/Fact}
{Lemma}
{}
{
\factsituation {Let $K$ denote a
field,
and let $V$ denote a
finite-dimensional
vector space.
Let
\mathdisp {\varphi \colon V \longrightarrow V} { }
denote a
linear mapping
and
\mathrelationchain
{\relationchain
{ \lambda
}
{ \in }{ K
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}}
\factconclusion {Then we have the estimate
\mathrelationchaindisplay
{\relationchain
{ \dim_{ K } { \left( \operatorname{Eig}_{ \lambda } { \left( \varphi \right) } \right) }
}
{ \leq} { \mu_\lambda(\varphi)
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
between the
geometric
and the
algebraic multiplicity.}
\factextra {}
}
{
Let
\mathrelationchain
{\relationchain
{m
}
{ = }{ \dim_{ K } { \left( \operatorname{Eig}_{ \lambda } { \left( \varphi \right) } \right) }
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
and let \mathl{v_1 , \ldots , v_m}{} be a
basis
of this
eigenspace.
We complement this basis with \mathl{w_1 , \ldots , w_{n-m}}{} to get a basis of $V$, using
Theorem 23.23
.
With respect to this basis, the
describing matrix
has the form
\mathdisp {\begin{pmatrix} \lambda E_m & B \\ 0 & C \end{pmatrix}} { . }
Therefore, the
characteristic polynomial
equals
\extrabracket {using
Exercise 26.9
} {} {}
\mathl{(X- \lambda)^m \cdot \chi_{ C }}{,} so that the
algebraic multiplicity
is at least $m$.
\inputexample{}
{
We consider the \mathl{2\times 2}{-}\keyword {shearing matrix} {}
\mathrelationchaindisplay
{\relationchain
{ M
}
{ =} { \begin{pmatrix} 1 & a \\ 0 & 1 \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{,}
with
\mathrelationchain
{\relationchain
{a
}
{ \in }{K
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
The
characteristic polynomial
is
\mathrelationchaindisplay
{\relationchain
{ \chi_{ M }
}
{ =} {(X-1)(X-1)
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{,}
so that $1$ is the only
eigenvalue
of $M$. The corresponding
eigenspace
is
\mathrelationchaindisplay
{\relationchain
{ \operatorname{Eig}_{ 1 } { \left( M \right) }
}
{ =} { \operatorname{ker} { \left( \begin{pmatrix} 0 & -a \\ 0 & 0 \end{pmatrix} \right) }
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
From
\mathrelationchaindisplay
{\relationchain
{ \begin{pmatrix} 0 & -a \\ 0 & 0 \end{pmatrix} \begin{pmatrix} r \\s \end{pmatrix}
}
{ =} { \begin{pmatrix} -as \\0 \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{,}
we get that \mathl{\begin{pmatrix} 1 \\0 \end{pmatrix}}{} is an
eigenvector,
and in case
\mathrelationchain
{\relationchain
{a
}
{ \neq }{0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
the eigenspace is one-dimensional
\extrabracket {in case
\mathrelationchain
{\relationchain
{ a
}
{ = }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
we have the identity and the eigenspace is two-dimensional} {} {.}
So in case
\mathrelationchain
{\relationchain
{a
}
{ \neq }{0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
the
algebraic multiplicity
of the eigenvalue $1$ equals $2$, and the
geometric multiplicity
equals $1$.
}
\subtitle {Diagonalizable mappings}
The restriction of a linear mapping to an eigenspace is the homothety with the corresponding eigenvalue, so it is quite a simple linear mapping. If there are many eigenvalues with high-dimensional eigenspaces, then the linear mapping is usually simple in some sense. An extreme case is given by the so-called diagonalizable mappings.
For a diagonal matrix
\mathdisp {\begin{pmatrix} d_1 & 0 & \cdots & \cdots & 0 \\ 0 & d_2 & 0 & \cdots & 0 \\ \vdots & \ddots & \ddots & \ddots & \vdots \\ 0 & \cdots & 0 & d_{ n-1} & 0 \\ 0 & \cdots & \cdots & 0 & d_{ n } \end{pmatrix}} { , }
the characteristic polynomial is just
\mathdisp {(X-d_1) (X-d_2) \cdots (X-d_n)} { . }
If the number $d$ occurs $k$ times as a diagonal entry, then the linear factor \mathl{X-d}{} also occurs with exponent $k$ in the factorization of the characteristic polynomial. This also holds when we just have an upper triangular matrix. But in the case of a diagonal matrix, we can also read off the eigenspaces immediately, see
Example 27.7
.
The eigenspace for $d$ consists of all linear combinations of those standard vectors $e_i$ for which $d_i$ equals $d$. In particular, the dimension of the eigenspace equals the number of times that $d$ occurs as a diagonal entry. Thus, for a diagonal matrix, the algebraic and the geometric multiplicities coincide.
\inputdefinition
{ }
{
Let $K$ denote a
field,
let $V$ denote a
vector space,
and let
\mathdisp {\varphi \colon V \longrightarrow V} { }
denote a
linear mapping.
Then $\varphi$ is called \definitionword {diagonalizable}{,} if $V$ has a
basis
consisting of
eigenvectors of $\varphi$.
}
\inputfactproof
{Linear mapping/Diagonalizable/Characterizations/Fact}
{Theorem}
{}
{
\factsituation {Let $K$ denote a
field,
and let $V$ denote a
finite-dimensional
vector space.
Let
\mathdisp {\varphi \colon V \longrightarrow V} { }
denote a
linear mapping.}
\factsegue {Then the following statements are equivalent.}
\factconclusion {\enumerationthree {$\varphi$ is
diagonalizable.
} {There exists a basis $\mathfrak{ v }$ of $V$ such that the
describing matrix
\mathl{M_ \mathfrak{ v }^ \mathfrak{ v }(\varphi)}{} is a
diagonal matrix.
} {For every describing matrix
\mathrelationchain
{\relationchain
{M
}
{ = }{ M_ \mathfrak{ w }^ \mathfrak{ w }(\varphi)
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
with respect to a basis $\mathfrak{ w }$, there exists an
invertible matrix
$B$ such that
\mathdisp {B M B^{-1}} { }
is a diagonal matrix.
}}
\factextra {}
}
{
The equivalence between (1) and (2) follows from the definition, from Example 27.7, and from the correspondence between linear mappings and matrices. The equivalence between (2) and (3) follows from Corollary 25.9.
\inputfactproof
{Linear mapping/Different eigenvalues/Diagonalizable/Fact}
{Corollary}
{}
{
\factsituation {Let $K$ denote a
field,
and let $V$ denote a
finite-dimensional
vector space.
Let
\mathdisp {\varphi \colon V \longrightarrow V} { }
denote a
linear mapping.}
\factcondition {Suppose that there exist $n$ different
eigenvalues.}
\factconclusion {Then $\varphi$ is
diagonalizable.}
\factextra {}
}
{
Because of Lemma 27.14, there exist $n$ linearly independent eigenvectors. These form, due to Corollary 23.21, a basis.
\inputexample{}
{
We continue with
Example 27.9
.
There exists the two
eigenvectors
\mathcor {} {\begin{pmatrix} \sqrt{5} \\1 \end{pmatrix}} {and} {\begin{pmatrix} -\sqrt{5} \\1 \end{pmatrix}} {}
for the different
eigenvalues
\mathcor {} {\sqrt{5}} {and} {- \sqrt{5}} {,}
so that the mapping is
diagonalizable,
due to
Corollary 28.10
.
With respect to the
basis
$\mathfrak{ u }$, consisting of these eigenvectors, the linear mapping is described by the diagonal matrix
\mathdisp {\begin{pmatrix} \sqrt{5} & 0 \\ 0 & - \sqrt{5} \end{pmatrix}} { . }
The
transformation matrix,
from the basis $\mathfrak{ u }$ to the standard basis $\mathfrak{ v }$, consisting of
\mathcor {} {e_1} {and} {e_2} {,}
is simply
\mathrelationchaindisplay
{\relationchain
{ M^{ \mathfrak{ u } }_{ \mathfrak{ v } }
}
{ =} { \begin{pmatrix} \sqrt{5} & - \sqrt{5} \\ 1 & 1 \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
The
inverse matrix
is
\mathrelationchaindisplay
{\relationchain
{ \frac{1}{2 \sqrt{5} } \begin{pmatrix} 1 & \sqrt{5} \\ -1 & \sqrt{5} \end{pmatrix}
}
{ =} { \begin{pmatrix} \frac{1}{2 \sqrt{5} } & \frac{1}{2} \\ \frac{-1}{2 \sqrt{5} } & \frac{1}{2} \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
Because of
Corollary 25.9
,
we have the relation
\mathrelationchainalign
{\relationchainalign
{ \begin{pmatrix} \sqrt{5} & 0 \\ 0 & - \sqrt{5} \end{pmatrix}
}
{ =} { \begin{pmatrix} \frac{1}{2 } & \frac{ \sqrt{5} }{2} \\ \frac{1}{2 } & \frac{ -\sqrt{5} }{2} \end{pmatrix} \begin{pmatrix} \sqrt{5} & - \sqrt{5} \\ 1 & 1 \end{pmatrix}
}
{ =} { \begin{pmatrix} \frac{1}{2 \sqrt{5} } & \frac{1}{2} \\ \frac{-1}{2 \sqrt{5} } & \frac{1}{2} \end{pmatrix} \begin{pmatrix} 0 & 5 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} \sqrt{5} & - \sqrt{5} \\ 1 & 1 \end{pmatrix}
}
{ } {
}
{ } {}
}
{}
{}{.}
}
\subtitle {Multiplicities and diagonalizable matrices}
\inputfaktbeweisnichtvorgefuehrt
{Endomorphism/Diagonalizable/Algebraic and geometric multiplicity/Fact}
{Theorem}
{}
{
\factsituation {Let $K$ denote a
field,
and let $V$ denote a
finite-dimensional
vector space.
Let
\mathdisp {\varphi \colon V \longrightarrow V} { }
denote a
linear mapping.}
\factconclusion {Then $\varphi$ is
diagonalizable
if and only if the
characteristic polynomial
$\chi_{ \varphi }$ is a product of
linear factors
and if for every zero $\lambda$ with
algebraic multiplicity
$\mu_\lambda$, the identity
\mathrelationchaindisplay
{\relationchain
{ \mu_\lambda
}
{ =} { \dim_{ K } { \left( \operatorname{Eig}_{ \lambda } { \left( \varphi \right) } \right) }
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
holds.}
\factextra {}
}
{
If $\varphi$ is
diagonalizable,
then we can assume at once that $\varphi$ is described by a
diagonal matrix
with respect to a basis of eigenvectors. The diagonal entries of this matrix are the eigenvalues, and each occurs as often as its
geometric multiplicity
indicates. The
characteristic polynomial
can be read off directly from the diagonal matrix, every diagonal entry $\lambda$ constitutes a linear factor \mathl{X- \lambda}{.}
For the other direction, let \mathl{\lambda_1 , \ldots , \lambda_k}{} denote the different eigenvalues, and let
\mathrelationchaindisplay
{\relationchain
{ \mu_i
}
{ \defeq} { \mu_{\lambda_i}(\varphi)
}
{ =} { \dim_{ K } { \left( \operatorname{Eig}_{ \lambda_i } { \left( \varphi \right) } \right) }
}
{ } {
}
{ } {
}
}
{}{}{}
denote the
\extrabracket {geometric and algebraic} {} {}
multiplicities. Due to the condition, the characteristic polynomial factors into linear factors. Therefore, the sum of these numbers equals
\mathrelationchain
{\relationchain
{ n
}
{ = }{ \dim_{ K } { \left( V \right) }
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
Because of
Fact *****,
the sum of the eigenspaces
\mathrelationchaindisplay
{\relationchain
{ \operatorname{Eig}_{ \lambda_1 } { \left( \varphi \right) } \oplus \cdots \oplus \operatorname{Eig}_{ \lambda_k } { \left( \varphi \right) }
}
{ \subseteq} { V
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
is direct. By the condition, the dimension on the left is also $n$, so that we have equality. Due to
Fact *****,
$\varphi$ is diagonalizable.
The product of two diagonal matrices is again a diagonal matrix. The following example shows that the product of two diagonalizable matrices is in general not diagonalizable.
\inputexample{}
{
Let
\mathcor {} {G_1} {and} {G_2} {}
denote two lines in $\R^2$ through the origin, and let
\mathcor {} {\varphi_1} {and} {\varphi_2} {}
denote the reflections at these axes. A reflection at an axis is always
diagonalizable,
the axis and the line orthogonal to the axis are eigenlines
\extrabracket {with eigenvalues $1$ and $-1$} {} {.}
The
composition
\mathrelationchaindisplay
{\relationchain
{ \psi
}
{ =} { \varphi_2 \circ \varphi_1
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
of the reflections is a
plane rotation,
the angle of rotation being twice the angle between the two lines. However, a rotation is diagonalizable only if the angle of rotation is
\mathcor {} {0} {or} {180} {}
degrees. If the angle between the axes is different from \mathl{0}{} and \mathl{90}{} degrees, then $\psi$ does not have any eigenvector.
}
\subtitle {Trigonalizable mappings}
\inputdefinition
{ }
{
Let $K$ denote a field, and let $V$ denote a finite-dimensional vector space. A linear mapping $\varphi \colon V \rightarrow V$ is called \definitionword {trigonalizable}{,} if there exists a basis such that the describing matrix of $\varphi$ with respect to this basis is an
upper triangular matrix.}
Diagonalizable linear mappings are, in particular, trigonalizable. The converse does not hold, as Example 28.7 shows.
\inputfaktbeweisnichtvorgefuehrt
{Linear mapping/Trigonalizable/Characterization with characteristic polynomial/Fact}
{Theorem}
{}
{
\factsituation {Let $K$ denote a
field,
and let $V$ denote a
finite-dimensional
vector space.
Let
\mathdisp {\varphi \colon V \longrightarrow V} { }
denote a
linear mapping.}
\factsegue {Then the following statements are equivalent.}
\factconclusion {\enumerationtwo {$\varphi$ is
trigonalizable.
} {The
characteristic polynomial
$\chi_{ \varphi }$ has a factorization into
linear factors.
}}
\factextra {If $\varphi$ is trigonalizable and is described by the matrix $M$ with respect to some basis, then there exists an invertible matrix
\mathrelationchain
{\relationchain
{B
}
{ \in }{ \operatorname{Mat}_{ n \times n } (K)
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
such that \mathl{BMB^{-1}}{} is an
upper triangular matrix.}
}
{Linear mapping/Trigonalizable/Characterization with characteristic polynomial/Fact/Proof
\inputfactproof
{Square matrix/C/Trigonalizable/Fact}
{Theorem}
{}
{
\factsituation {Let
\mathrelationchain
{\relationchain
{M
}
{ \in }{\operatorname{Mat}_{ n \times n } (\Complex)
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
denote a square matrix with
complex
entries.}
\factconclusion {Then $M$ is
trigonalizable.}
\factextra {}
}
{
This follows from Theorem 28.15 and the Fundamental theorem of algebra.
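For a concrete instance, take the rotation by $90$ degrees: over $\R$ it has no eigenvalues at all, but over $\Complex$ its characteristic polynomial $X^2 + 1$ splits, so an upper triangular similar matrix exists. The sketch below uses sympy's Jordan form (a stronger normal form than required here) merely to produce such a triangular matrix; this is an illustration, not the proof method of the theorem.

```python
from sympy import Matrix, I, simplify

# rotation by 90 degrees: no real eigenvalues, but trigonalizable over C
M = Matrix([[0, -1], [1, 0]])
P, J = M.jordan_form()          # M = P * J * P^{-1}
assert J.is_upper               # J is upper triangular
assert {J[0, 0], J[1, 1]} == {I, -I}
assert (P * J * P.inv()).applyfunc(simplify) == M
```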