
Linear algebra (Osnabrück 2024-2025)/Part I/Lecture 23




The characteristic polynomial

We want to determine, for a given endomorphism $\varphi \colon V \rightarrow V$, the eigenvalues and the eigenspaces. For this, the characteristic polynomial is decisive.


For an $n \times n$-matrix $M$ with entries in a field $K$, the polynomial

$\chi_M := \det( X E_n - M )$

is called the characteristic polynomial[1] of $M$.

For $M = (a_{ij})_{ij}$, this means

$\chi_M = \det \begin{pmatrix} X - a_{11} & -a_{12} & \cdots & -a_{1n} \\ -a_{21} & X - a_{22} & \cdots & -a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ -a_{n1} & -a_{n2} & \cdots & X - a_{nn} \end{pmatrix}.$

In this definition, we use the determinant of a matrix, which we have only defined for matrices with entries in a field. The entries are now elements of the polynomial ring $K[X]$. But, since we can consider these elements also inside the field of rational functions $K(X)$,[2] this is a useful definition. By definition, the determinant is an element of $K(X)$; but, because all entries of the matrix are polynomials, and because the recursive definition of the determinant uses only addition and multiplication, the characteristic polynomial is indeed a polynomial. The degree of the characteristic polynomial is $n$, and its leading coefficient is $1$, so it has the form

$\chi_M = X^n + c_{n-1} X^{n-1} + \cdots + c_1 X + c_0.$

We have the important relation

$\chi_M(\lambda) = \det( \lambda E_n - M )$

for every $\lambda \in K$, see Exercise 23.3. Here, on the left-hand side, the number $\lambda$ is inserted into the polynomial, and on the right-hand side, we have the determinant of a matrix that depends on $\lambda$.
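
For instance, for a $2 \times 2$-matrix, expanding the determinant gives

$\chi_{\begin{pmatrix} a & b \\ c & d \end{pmatrix}} = \det \begin{pmatrix} X - a & -b \\ -c & X - d \end{pmatrix} = (X-a)(X-d) - bc = X^2 - (a+d) X + (ad - bc),$

so in this case the coefficients are, up to sign, the trace and the determinant of the matrix.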

For a linear mapping

$\varphi \colon V \longrightarrow V$

on a finite-dimensional vector space $V$, the characteristic polynomial is defined by

$\chi_\varphi := \chi_M,$

where $M$ is a describing matrix with respect to some basis. The multiplication theorem for the determinant shows that this definition is independent of the choice of the basis, see Exercise 23.24.
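
The computation behind this independence is short: if $N = B M B^{-1}$ describes $\varphi$ with respect to another basis, where $B$ is the (invertible) transformation matrix, then

$\det( X E_n - N ) = \det\big( B ( X E_n - M ) B^{-1} \big) = \det(B) \cdot \det( X E_n - M ) \cdot \det(B)^{-1} = \det( X E_n - M ),$

since $X E_n$ commutes with every matrix.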

The characteristic polynomial of the identity on an $n$-dimensional vector space $V$ is

$\chi_{\operatorname{Id}_V} = \det( X E_n - E_n ) = (X-1)^n.$


Let $K$ denote a field, and let $V$ denote an $n$-dimensional vector space. Let

$\varphi \colon V \longrightarrow V$

denote a linear mapping. Then $\lambda \in K$ is an eigenvalue of $\varphi$ if and only if $\lambda$ is a zero of the characteristic polynomial $\chi_\varphi$.

Let $M$ denote a describing matrix for $\varphi$, and let $\lambda \in K$ be given. We have

$\chi_\varphi(\lambda) = \det( \lambda E_n - M ) = 0$

if and only if the linear mapping

$\lambda \operatorname{Id}_V - \varphi$

is not bijective (and not injective) (due to Theorem 16.11 and Lemma 12.5). This is, because of Lemma 22.1 and Lemma 11.4, equivalent with

$\operatorname{Eig}_\lambda(\varphi) = \ker( \lambda \operatorname{Id}_V - \varphi ) \neq 0,$

and this means that the eigenspace for $\lambda$ is not the null space, thus $\lambda$ is an eigenvalue for $\varphi$.



We consider the real matrix . The characteristic polynomial is

The eigenvalues are therefore (we have found these eigenvalues already in Example 21.5 , without using the characteristic polynomial).
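
As a further illustration (with a matrix chosen here only as an example), for $\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ the characteristic polynomial is

$\det \begin{pmatrix} X - 2 & -1 \\ -1 & X - 2 \end{pmatrix} = (X-2)^2 - 1 = (X-1)(X-3),$

so the eigenvalues of this matrix are $1$ and $3$.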


For the matrix

the characteristic polynomial is

Finding the zeroes of this polynomial leads to the condition

which has no solution over , so that the matrix has no eigenvalues over . However, considered over the complex numbers , we have the two eigenvalues and . For the eigenspace for , we have to determine

a basis vector (hence an eigenvector) of this is . Analogously, we get
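
A typical instance of this phenomenon is the rotation of the plane by 90 degrees: for

$N = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix},$

the characteristic polynomial is $X^2 + 1$, which has no real zero; over $\mathbb{C}$, the zeroes are $i$ and $-i$, and corresponding eigenvectors are $\begin{pmatrix} 1 \\ -i \end{pmatrix}$ and $\begin{pmatrix} 1 \\ i \end{pmatrix}$, respectively.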


For an upper triangular matrix

$M = \begin{pmatrix} d_1 & \ast & \cdots & \ast \\ 0 & d_2 & \cdots & \ast \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & d_n \end{pmatrix},$

the characteristic polynomial is

$\chi_M = (X - d_1)(X - d_2) \cdots (X - d_n),$

due to Lemma 16.4. In this case, we have directly a factorization of the characteristic polynomial into linear factors, so that we can see immediately the zeroes, and hence the eigenvalues of $M$: they are just the diagonal entries $d_1, d_2, \ldots, d_n$ (which might not all be different).
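
For example,

$\chi_{\begin{pmatrix} 2 & 5 & 7 \\ 0 & 2 & 3 \\ 0 & 0 & 4 \end{pmatrix}} = (X-2)^2 (X-4),$

so the eigenvalues of this matrix are $2$ and $4$.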



Invariant linear subspaces

Let $K$ be a field, $V$ a vector space over $K$, and

$\varphi \colon V \longrightarrow V$

a linear mapping. A linear subspace $U \subseteq V$ is called $\varphi$-invariant if

$\varphi(U) \subseteq U$

holds.

The zero space $0$ and the total space $V$ are $\varphi$-invariant. Moreover, the eigenspaces of $\varphi$ are $\varphi$-invariant: if $v \in \operatorname{Eig}_\lambda(\varphi)$, then $\varphi(v) = \lambda v$ belongs again to $\operatorname{Eig}_\lambda(\varphi)$.


Let $V$ be a finite-dimensional $K$-vector space, and let

$\varphi \colon V \longrightarrow V$

be a linear mapping. Let

$V = U_1 \oplus U_2$

denote a direct sum decomposition into $\varphi$-invariant linear subspaces. Then the characteristic polynomial fulfills the relation

$\chi_\varphi = \chi_{\varphi|_{U_1}} \cdot \chi_{\varphi|_{U_2}}.$

Let $u_1, \ldots, u_k$ be a basis of $U_1$, and let $w_1, \ldots, w_m$ be a basis of $U_2$; together, they form a basis of $V$. With respect to this basis, $\varphi$ is described by the block matrix

$M = \begin{pmatrix} M_1 & 0 \\ 0 & M_2 \end{pmatrix},$

where $M_1$ describes the restriction $\varphi|_{U_1}$ and $M_2$ describes the restriction $\varphi|_{U_2}$. Then, using Exercise 16.23, we get

$\chi_\varphi = \det( X E_n - M ) = \det( X E_k - M_1 ) \cdot \det( X E_m - M_2 ) = \chi_{\varphi|_{U_1}} \cdot \chi_{\varphi|_{U_2}}.$
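
As an illustration, consider the block-diagonal matrix

$M = \begin{pmatrix} 1 & 2 & 0 \\ 3 & 4 & 0 \\ 0 & 0 & 5 \end{pmatrix}.$

The plane spanned by the first two standard vectors and the line spanned by the third standard vector are invariant, and accordingly

$\chi_M = ( X^2 - 5X - 2 ) ( X - 5 ).$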




Algebraic multiplicity

For a more detailed investigation of the eigenspaces, the following concept is useful.


Let

$\varphi \colon V \longrightarrow V$

be a linear mapping on a finite-dimensional $K$-vector space $V$, and let $\lambda \in K$. The exponent of the linear polynomial $X - \lambda$ in the characteristic polynomial $\chi_\varphi$ is called the algebraic multiplicity of $\lambda$. It is denoted by

$\mu_\lambda(\varphi).$

Recall that

$\dim\left( \operatorname{Eig}_\lambda(\varphi) \right)$

is called the geometric multiplicity of $\lambda$. We know, due to Theorem 23.2, that one of these multiplicities is positive if and only if this is true for the other multiplicity, and this is the case if and only if $\lambda$ is an eigenvalue of $\varphi$.
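
For the identity on an $n$-dimensional vector space, for example, the only eigenvalue is $1$; its characteristic polynomial is $(X-1)^n$, and the eigenspace for $1$ is the whole space, so both the algebraic and the geometric multiplicity equal $n$.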

In general, the two multiplicities can be different; however, there is an estimate between them.


Let $K$ denote a field, and let $V$ denote a finite-dimensional vector space. Let

$\varphi \colon V \longrightarrow V$

denote a linear mapping, and let $\lambda \in K$. Then we have the estimate

$\dim\left( \operatorname{Eig}_\lambda(\varphi) \right) \leq \mu_\lambda(\varphi)$

between the geometric and the algebraic multiplicity.

Let $m = \dim\left( \operatorname{Eig}_\lambda(\varphi) \right)$, and let $v_1, \ldots, v_m$ be a basis of this eigenspace. We complement this basis with $v_{m+1}, \ldots, v_n$ to get a basis of $V$, using Theorem 8.10. With respect to this basis, the describing matrix has the form

$\begin{pmatrix} \lambda E_m & B \\ 0 & C \end{pmatrix}.$

Therefore, the characteristic polynomial equals (using Exercise 16.23) $(X - \lambda)^m \cdot \chi_C$, so that the algebraic multiplicity is at least $m$.



We consider the $2 \times 2$-shearing matrix

$M = \begin{pmatrix} 1 & a \\ 0 & 1 \end{pmatrix}$

with $a \in K$. The characteristic polynomial is

$\chi_M = (X - 1)^2,$

so that $1$ is the only eigenvalue of $M$. The corresponding eigenspace is

$\operatorname{Eig}_1(M) = \ker \begin{pmatrix} 0 & -a \\ 0 & 0 \end{pmatrix}.$

From

$\begin{pmatrix} 0 & -a \\ 0 & 0 \end{pmatrix} \begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix},$

we get that $\begin{pmatrix} 1 \\ 0 \end{pmatrix}$ is an eigenvector, and in case $a \neq 0$, the eigenspace is one-dimensional (in case $a = 0$, we have the identity, and the eigenspace is two-dimensional). So in case $a \neq 0$, the algebraic multiplicity of the eigenvalue $1$ equals $2$, and the geometric multiplicity equals $1$.



Multiplicities and diagonalizable mappings

Let $K$ denote a field, and let $V$ denote a finite-dimensional vector space. Let

$\varphi \colon V \longrightarrow V$

denote a linear mapping. Then $\varphi$ is diagonalizable if and only if the characteristic polynomial $\chi_\varphi$ is a product of linear factors and if, for every zero $\lambda$ with algebraic multiplicity $\mu_\lambda(\varphi)$, the identity

$\mu_\lambda(\varphi) = \dim\left( \operatorname{Eig}_\lambda(\varphi) \right)$

holds.

If $\varphi$ is diagonalizable, then we can assume at once that $\varphi$ is described by a diagonal matrix with respect to a basis of eigenvectors. The diagonal entries of this matrix are the eigenvalues, and each eigenvalue occurs as often as its geometric multiplicity tells us. The characteristic polynomial can be read off directly from the diagonal matrix: every diagonal entry $\lambda$ contributes a linear factor $X - \lambda$.

For the other direction, let $\lambda_1, \ldots, \lambda_k$ denote the different eigenvalues, and let

$\mu_i := \mu_{\lambda_i}(\varphi) = \dim\left( \operatorname{Eig}_{\lambda_i}(\varphi) \right)$

denote the (geometric and algebraic) multiplicities. Due to the condition, the characteristic polynomial factors into linear factors; therefore, the sum of these numbers equals $n = \dim(V)$. Because of Lemma 22.6, the sum of the eigenspaces

$\operatorname{Eig}_{\lambda_1}(\varphi) + \operatorname{Eig}_{\lambda_2}(\varphi) + \cdots + \operatorname{Eig}_{\lambda_k}(\varphi) \subseteq V$

is direct. By the condition, the dimension on the left is also $n$, so that we have equality. Due to Lemma 22.11, $\varphi$ is diagonalizable.
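
In particular, the criterion confirms that the shearing matrix considered above is not diagonalizable for $a \neq 0$: its characteristic polynomial $(X-1)^2$ is a product of linear factors, but the geometric multiplicity $1$ of the eigenvalue $1$ is smaller than its algebraic multiplicity $2$.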



Let $K$ denote a field, and let $V$ denote a $K$-vector space of finite dimension $n$. Let

$\varphi \colon V \longrightarrow V$

be a linear mapping. Suppose that the characteristic polynomial $\chi_\varphi$ factors into $n$ different linear factors. Then $\varphi$ is diagonalizable.

Proof

Every zero of the characteristic polynomial is an eigenvalue and therefore has geometric multiplicity at least $1$; since the $n$ linear factors are different, its algebraic multiplicity equals $1$, so by the estimate above the two multiplicities coincide. Hence the criterion of the preceding theorem is fulfilled, and $\varphi$ is diagonalizable.


This also gives a new proof of Corollary 22.10.
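
For instance, the matrix $\begin{pmatrix} 1 & 1 \\ 0 & 2 \end{pmatrix}$ has the characteristic polynomial $(X-1)(X-2)$, a product of two different linear factors; hence, it is diagonalizable, with $\begin{pmatrix} 1 \\ 0 \end{pmatrix}$ an eigenvector for the eigenvalue $1$ and $\begin{pmatrix} 1 \\ 1 \end{pmatrix}$ an eigenvector for the eigenvalue $2$.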



Footnotes
  1. Some authors define the characteristic polynomial as the determinant of $M - X E_n$, instead of $X E_n - M$. This changes the polynomial only by the factor $(-1)^n$, that is, at most by a sign.
  2. $K(X)$ is called the field of rational polynomials; it consists of all fractions $P/Q$ for polynomials $P, Q \in K[X]$ with $Q \neq 0$. For $K = \mathbb{R}$ or $K = \mathbb{C}$, this field can be identified with the field of rational functions.




