Eigenvalues and eigenvectors/Introduction/Section
For a reflection at an axis in the plane, certain vectors behave particularly simply. The vectors on the axis are sent to themselves, and the vectors orthogonal to the axis are sent to their negatives. For all these vectors, the image under this linear mapping lies on the line spanned by the vector itself. In the theory of eigenvalues and eigenvectors, we want to know whether, for a given linear mapping, there exist lines (one-dimensional linear subspaces) that are mapped to themselves. The goal is to find, for the linear mapping, a basis such that the describing matrix becomes as simple as possible. An important application is to find solutions for a system of linear differential equations.
Let $K$ be a field, $V$ a $K$-vector space and
$$\varphi \colon V \longrightarrow V$$
a linear mapping. Then an element $v \in V$, $v \neq 0$, is called an eigenvector of $\varphi$ (for the eigenvalue $\lambda$), if
$$\varphi(v) = \lambda v$$
holds for some $\lambda \in K$.

Let $K$ be a field, $V$ a $K$-vector space and
$$\varphi \colon V \longrightarrow V$$
a linear mapping. Then an element $\lambda \in K$ is called an eigenvalue for $\varphi$, if there exists a vector $v \in V$, $v \neq 0$, such that
$$\varphi(v) = \lambda v .$$
Let $K$ be a field, $V$ a $K$-vector space and
$$\varphi \colon V \longrightarrow V$$
a linear mapping. For $\lambda \in K$, we denote by
$$\operatorname{Eig}_{\lambda}(\varphi) = \{ v \in V \mid \varphi(v) = \lambda v \}$$
the eigenspace of $\varphi$ for the value $\lambda$.
Thus we allow arbitrary values $\lambda \in K$ (not only eigenvalues) in the definition of an eigenspace. We will see in fact that every eigenspace is a linear subspace. In particular, $0$ belongs to every eigenspace, though it is never an eigenvector. The linear subspace generated by an eigenvector is called an eigenline. For most $\lambda$ (in fact, for all up to finitely many, in case the vector space has finite dimension), the eigenspace is just the zero space.
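With respect to a basis, the eigenspace for $\lambda$ is the null space of $M - \lambda E_n$, where $M$ is the describing matrix. The following sketch computes two eigenspaces with sympy; the matrix is a sample chosen purely for illustration.

```python
import sympy as sp

# Sample matrix for illustration; the eigenspace for a value lam
# is the null space of M - lam * E_2.
M = sp.Matrix([[2, 1],
               [0, 2]])

# lam = 2 is an eigenvalue: the null space is one-dimensional.
print((M - 2 * sp.eye(2)).nullspace())   # [Matrix([[1], [0]])]

# lam = 3 is not an eigenvalue: the eigenspace is the zero space,
# so no basis vector is returned.
print((M - 3 * sp.eye(2)).nullspace())   # []
```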
We consider some easy examples over $\mathbb{R}$.
A linear mapping from $\mathbb{R}$ to $\mathbb{R}$ is the multiplication with a fixed number $a \in \mathbb{R}$ (the proportionality factor). Therefore, every number $x \neq 0$ is an eigenvector for the eigenvalue $a$, and the eigenspace for this eigenvalue is all of $\mathbb{R}$. Beside $a$, there are no other eigenvalues, and all eigenspaces for $\lambda \neq a$ equal $0$.
A linear mapping from $\mathbb{R}^2$ to $\mathbb{R}^2$ is described by a $2 \times 2$-matrix with respect to the standard basis. We consider the eigenvalues for some elementary examples. A homothety is given by $v \mapsto a v$, with a scaling factor $a \in \mathbb{R}$. Every vector $v \neq 0$ is an eigenvector for the eigenvalue $a$, and the eigenspace for this eigenvalue is all of $\mathbb{R}^2$. Beside $a$, there are no other eigenvalues, and all eigenspaces for $\lambda \neq a$ equal $0$. The identity has only the eigenvalue $1$.
The reflection at the $x$-axis is described by the matrix $\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}$. The eigenspace for the eigenvalue $1$ is the $x$-axis, the eigenspace for the eigenvalue $-1$ is the $y$-axis. A vector $\begin{pmatrix} s \\ t \end{pmatrix}$ with $s, t \neq 0$ is not an eigenvector, since the equation
$$\begin{pmatrix} s \\ -t \end{pmatrix} = \lambda \begin{pmatrix} s \\ t \end{pmatrix}$$
does not have a solution.
A plane rotation is described by a rotation matrix $\begin{pmatrix} \cos \alpha & -\sin \alpha \\ \sin \alpha & \cos \alpha \end{pmatrix}$ for a rotation angle $\alpha \in [0, 2\pi)$. For $\alpha = 0$, this is the identity; for $\alpha = \pi$, this is the half rotation, which is the reflection at the origin, or the homothety with factor $-1$. For all other rotation angles, there is no line sent to itself, so that these rotations have no eigenvalue and no eigenvector (and all eigenspaces equal $0$).
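These observations can be checked numerically. The following numpy sketch uses the reflection matrix from above and, as a sample angle, the quarter rotation $\alpha = \pi/2$.

```python
import numpy as np

# Reflection at the x-axis: eigenvalues 1 and -1.
reflection = np.array([[1.0,  0.0],
                       [0.0, -1.0]])
print(np.linalg.eigvals(reflection))   # [ 1. -1.]

# Quarter rotation: numpy returns the complex eigenvalues +i and -i,
# confirming that no real line is mapped to itself.
alpha = np.pi / 2
rotation = np.array([[np.cos(alpha), -np.sin(alpha)],
                     [np.sin(alpha),  np.cos(alpha)]])
print(np.linalg.eigvals(rotation))     # [0.+1.j 0.-1.j]
```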
Let $K$ be a field, $V$ a $K$-vector space and
$$\varphi \colon V \longrightarrow V$$
a linear mapping. Then the following statements hold.
- Every eigenspace $\operatorname{Eig}_{\lambda}(\varphi)$ is a linear subspace of $V$.
- $\lambda \in K$ is an eigenvalue for $\varphi$, if and only if the eigenspace $\operatorname{Eig}_{\lambda}(\varphi)$ is not the null space.
- A vector $v \in V$, $v \neq 0$, is an eigenvector for $\lambda \in K$, if and only if $v \in \operatorname{Eig}_{\lambda}(\varphi)$.
Proof
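The first statement can also be illustrated numerically: linear combinations of eigenvectors for the same eigenvalue stay inside the eigenspace. This is only a sketch with a sample matrix, not a substitute for the proof.

```python
import numpy as np

# Sample matrix with the eigenvalue 3 occurring twice.
M = np.array([[3.0, 0.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 5.0]])
lam = 3.0
u = np.array([1.0, 0.0, 0.0])    # M @ u == lam * u
v = np.array([0.0, 2.0, 0.0])    # M @ v == lam * v

w = 4.0 * u + v                  # an arbitrary linear combination
print(np.allclose(M @ w, lam * w))   # True: w lies in the eigenspace again
```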
For matrices, we use the same concepts. If
$$\varphi \colon V \longrightarrow V$$
is a linear mapping, and $M$ is a describing matrix with respect to a basis, then for an eigenvalue $\lambda$ and an eigenvector $v$ with corresponding coordinate tuple $x$ with respect to the basis, we have the relation
$$Mx = \lambda x .$$
The describing matrix $N$ with respect to another basis satisfies, due to the base change theorem, the relation $N = B M B^{-1}$, where $B$ is an invertible matrix. Let
$$y = Bx$$
denote the coordinate tuple of $v$ with respect to the second basis. Then
$$Ny = B M B^{-1} y = B M B^{-1} B x = B M x = B (\lambda x) = \lambda (Bx) = \lambda y ,$$
i.e., the describing matrices have the same eigenvalues, but the coordinate tuples of the eigenvectors are different.
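A short numpy sketch of this invariance; the matrices $M$ and $B$ below are sample choices for illustration.

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [0.0, 3.0]])        # eigenvalues 2 and 3
rng = np.random.default_rng(0)
B = rng.random((2, 2)) + 2 * np.eye(2)   # generically invertible
N = B @ M @ np.linalg.inv(B)             # describing matrix after base change

# Same eigenvalues (up to ordering and rounding)...
print(np.sort(np.linalg.eigvals(M)), np.sort(np.linalg.eigvals(N)))

# ...but the coordinate tuple of an eigenvector transforms with B.
x = np.array([1.0, 0.0])          # eigenvector of M for the eigenvalue 2
y = B @ x
print(np.allclose(N @ y, 2.0 * y))       # True
```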
We consider the linear mapping
$$\varphi \colon K^n \longrightarrow K^n$$
given by the diagonal matrix
$$\begin{pmatrix} d_1 & 0 & \cdots & 0 \\ 0 & d_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & d_n \end{pmatrix} .$$
The diagonal entries $d_i$ are eigenvalues of $\varphi$, and the $i$-th standard vector $e_i$ is a corresponding eigenvector. The eigenspaces are
$$\operatorname{Eig}_{\lambda}(\varphi) = \langle e_i \mid d_i = \lambda \rangle .$$
These spaces are different from $0$ if and only if $\lambda$ equals one of the diagonal entries. The dimension of the eigenspace $\operatorname{Eig}_{\lambda}(\varphi)$ equals the number of times the value $\lambda$ occurs on the diagonal. The sum of all these dimensions gives $n$.
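A quick numpy check with sample diagonal entries, one of them repeated:

```python
import numpy as np

D = np.diag([2.0, 2.0, 7.0])     # the diagonal entry 2 occurs twice
vals, vecs = np.linalg.eig(D)
print(vals)                      # [2. 2. 7.]: exactly the diagonal entries
print(vecs)                      # the standard vectors as eigenvectors

# The eigenspace for 2 is two-dimensional, the eigenspace for 7 is
# one-dimensional; the dimensions sum to n = 3.
print(np.count_nonzero(vals == 2.0), np.count_nonzero(vals == 7.0))
```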
For an orthogonal reflection of $\mathbb{R}^n$, there exists an $(n-1)$-dimensional linear subspace $U \subseteq \mathbb{R}^n$ which is fixed by the mapping, and every vector orthogonal to $U$ is sent to its negative. If $u_1, \ldots, u_{n-1}$ is a basis of $U$ and $w$ is a vector orthogonal to $U$, then the reflection is described by the matrix
$$\begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & \ddots & & \vdots \\ \vdots & & 1 & 0 \\ 0 & \cdots & 0 & -1 \end{pmatrix}$$
with respect to the basis $u_1, \ldots, u_{n-1}, w$.
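With respect to the standard basis, such a reflection can be realized as the Householder matrix $E_n - 2 w w^{t}$ for a unit vector $w$ orthogonal to $U$. The following numpy sketch (with a sample $w$) confirms the eigenvalue $1$ with multiplicity $n-1$ and the eigenvalue $-1$.

```python
import numpy as np

# Reflection of R^3 at the plane orthogonal to w (sample vector).
w = np.array([1.0, 2.0, 2.0])
w = w / np.linalg.norm(w)                 # normalize
H = np.eye(3) - 2.0 * np.outer(w, w)      # Householder matrix

print(np.round(np.linalg.eigvals(H), 10)) # 1 twice, -1 once
print(np.allclose(H @ w, -w))             # True: w is sent to its negative
```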
We consider the linear mapping
$$\varphi \colon \mathbb{Q}^2 \longrightarrow \mathbb{Q}^2$$
given by the matrix
$$M = \begin{pmatrix} 0 & 5 \\ 1 & 0 \end{pmatrix} .$$
The question whether this mapping has eigenvalues leads to the question whether there exists some $\lambda \in \mathbb{Q}$ such that the equation
$$M \begin{pmatrix} x \\ y \end{pmatrix} = \lambda \begin{pmatrix} x \\ y \end{pmatrix}$$
has a nontrivial solution $\begin{pmatrix} x \\ y \end{pmatrix} \neq 0$. For a given $\lambda$, this is a linear problem and can be solved with the elimination algorithm. However, the question whether there exist eigenvalues at all leads, due to the variable "eigenvalue parameter" $\lambda$, to a nonlinear problem. The system of equations above is
$$5y = \lambda x \text{ and } x = \lambda y .$$
For $\lambda = 0$, we get $x = y = 0$, but the null vector is not an eigenvector. Hence, suppose that $\lambda \neq 0$. Both equations combined yield the condition
$$5y = \lambda x = \lambda^2 y ,$$
hence $\lambda^2 = 5$. But in $\mathbb{Q}$, the number $5$ does not have a square root; therefore, there is no solution, and that means that $\varphi$ has no eigenvalues and no eigenvectors.
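This can be confirmed symbolically with sympy (a sketch; $M$ is the matrix as reconstructed above):

```python
import sympy as sp

M = sp.Matrix([[0, 5],
               [1, 0]])
print(M.eigenvals())     # {sqrt(5): 1, -sqrt(5): 1} -- both irrational

# The condition lambda**2 = 5 has no rational solution:
lam = sp.symbols('lam')
print(sp.solveset(lam**2 - 5, lam, domain=sp.S.Rationals))   # EmptySet
```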
Now we consider the same matrix as a real matrix, and look at the corresponding mapping
$$\varphi \colon \mathbb{R}^2 \longrightarrow \mathbb{R}^2 .$$
The same computations as above lead to the condition $\lambda^2 = 5$, and within the real numbers, we have the two solutions
$$\lambda_1 = \sqrt{5} \text{ and } \lambda_2 = -\sqrt{5} .$$
For both values, we now have to find the eigenvectors. First, we consider the case $\lambda = \sqrt{5}$, which yields the linear system
$$\begin{pmatrix} 0 & 5 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \sqrt{5} \begin{pmatrix} x \\ y \end{pmatrix} .$$
We write this as
$$\begin{pmatrix} -\sqrt{5} & 5 \\ 1 & -\sqrt{5} \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} ,$$
and as
$$-\sqrt{5} x + 5 y = 0 \text{ and } x - \sqrt{5} y = 0 .$$
This system can be solved easily; the solution space has dimension one, and
$$\begin{pmatrix} \sqrt{5} \\ 1 \end{pmatrix}$$
is a basic solution.
For $\lambda = -\sqrt{5}$, we do the same steps, and the vector
$$\begin{pmatrix} -\sqrt{5} \\ 1 \end{pmatrix}$$
is a basic solution. Thus over $\mathbb{R}$, the numbers $\sqrt{5}$ and $-\sqrt{5}$ are eigenvalues, and the corresponding eigenspaces are
$$\operatorname{Eig}_{\sqrt{5}}(\varphi) = \left\langle \begin{pmatrix} \sqrt{5} \\ 1 \end{pmatrix} \right\rangle \text{ and } \operatorname{Eig}_{-\sqrt{5}}(\varphi) = \left\langle \begin{pmatrix} -\sqrt{5} \\ 1 \end{pmatrix} \right\rangle .$$
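A final numpy check of both eigenvector computations (a numerical sketch of the example above):

```python
import numpy as np

M = np.array([[0.0, 5.0],
              [1.0, 0.0]])
s5 = np.sqrt(5.0)

v_plus = np.array([s5, 1.0])     # basic solution for lambda = sqrt(5)
v_minus = np.array([-s5, 1.0])   # basic solution for lambda = -sqrt(5)

print(np.allclose(M @ v_plus,  s5 * v_plus))    # True
print(np.allclose(M @ v_minus, -s5 * v_minus))  # True
```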