
Linear algebra (Osnabrück 2024-2025)/Part I/Lecture 11




Linear subspace under a linear mapping

A typical property of a linear mapping is that it maps lines to lines (or to a point). More generally, the following statement holds.


Let $K$ be a field, let $V$ and $W$ denote $K$-vector spaces, and let
$$\varphi \colon V \longrightarrow W$$
be a $K$-linear mapping. Then the following hold.
  1. For a linear subspace $S \subseteq V$, the image $\varphi(S)$ is a linear subspace of $W$.
  2. In particular, the image $\operatorname{Im} \varphi = \varphi(V)$ of the mapping is a linear subspace of $W$.
  3. For a linear subspace $T \subseteq W$, the preimage $\varphi^{-1}(T)$ is a linear subspace of $V$.
  4. In particular, $\varphi^{-1}(0)$ is a linear subspace of $V$.

Proof



Let $K$ denote a field, let $V$ and $W$ denote $K$-vector spaces, and let
$$\varphi \colon V \longrightarrow W$$
denote a $K$-linear mapping. Then
$$\ker \varphi := \varphi^{-1}(0) = \{ v \in V \mid \varphi(v) = 0 \}$$
is called the kernel of $\varphi$.

Due to the statement above, the kernel is a linear subspace of $V$.


For an $m \times n$-matrix $M$, the kernel of the linear mapping
$$K^n \longrightarrow K^m$$
given by $x \longmapsto Mx$ is just the solution space of the homogeneous linear system
$$Mx = 0.$$
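For a concrete illustration, the following sketch (in Python with the sympy library; the matrix is sample data chosen here, not taken from the lecture) computes such a kernel as the nullspace of a matrix:

```python
from sympy import Matrix

# Sample 2x3 matrix (chosen here for illustration); it describes a linear map K^3 -> K^2.
M = Matrix([[1, 2, 3],
            [2, 4, 6]])

# The kernel of x -> M*x is the solution space of the homogeneous system M*x = 0.
kernel_basis = M.nullspace()
print(kernel_basis)          # two basis vectors, so the kernel has dimension 2

# Every kernel basis vector is indeed mapped to zero.
assert all(M * v == Matrix([0, 0]) for v in kernel_basis)
```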

The following criterion for injectivity is important.


Let $K$ denote a field, let $V$ and $W$ denote $K$-vector spaces, and let
$$\varphi \colon V \longrightarrow W$$
denote a $K$-linear mapping. Then $\varphi$ is injective if and only if
$$\ker \varphi = 0$$
holds.

If the mapping $\varphi$ is injective, then, apart from $0$, there can exist no other vector $v \in V$ with $\varphi(v) = 0$. Hence, $\varphi^{-1}(0) = \{0\}$.
So suppose that $\ker \varphi = 0$, and let $v_1, v_2 \in V$ be given with $\varphi(v_1) = \varphi(v_2)$. Then, due to linearity,
$$\varphi(v_1 - v_2) = \varphi(v_1) - \varphi(v_2) = 0.$$
Therefore, $v_1 - v_2 \in \ker \varphi = 0$, and so $v_1 = v_2$.
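As a numerical companion to this criterion, the following sketch (again Python/sympy, with sample matrices chosen here) checks injectivity by testing whether the nullspace is trivial:

```python
from sympy import Matrix

# Two sample matrices (chosen for illustration), each describing a linear map K^2 -> K^3.
A = Matrix([[1, 0],
            [0, 1],
            [1, 1]])     # columns are linearly independent
B = Matrix([[1, 2],
            [2, 4],
            [3, 6]])     # the second column is twice the first

# The map x -> A*x is injective precisely when its kernel is the zero space,
# i.e. when the nullspace has no basis vectors at all.
print(len(A.nullspace()))   # 0 -> kernel = 0, the map is injective
print(len(B.nullspace()))   # 1 -> nontrivial kernel, the map is not injective
```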



The dimension formula

The following statement is called the dimension formula.


Let $K$ denote a field, let $V$ and $W$ denote $K$-vector spaces, and let
$$\varphi \colon V \longrightarrow W$$
denote a $K$-linear mapping. Suppose that $V$ has finite dimension. Then
$$\dim(V) = \dim(\ker \varphi) + \dim(\operatorname{Im} \varphi)$$
holds.

Set $n = \dim(V)$. Let $U = \ker \varphi$ denote the kernel of the mapping, and let $k = \dim(U)$ denote its dimension ($k \leq n$). Let
$$u_1, \ldots, u_k$$
be a basis of $U$. Due to Theorem 8.10, there exist vectors
$$v_1, \ldots, v_{n-k}$$
such that
$$u_1, \ldots, u_k, v_1, \ldots, v_{n-k}$$
is a basis of $V$. We claim that
$$\varphi(v_1), \ldots, \varphi(v_{n-k})$$
is a basis of the image. Let $w \in W$ be an element of the image $\varphi(V)$. Then there exists a vector $v \in V$ such that $\varphi(v) = w$. We can write $v$ with respect to the basis as
$$v = \sum_{i=1}^{k} s_i u_i + \sum_{j=1}^{n-k} t_j v_j.$$
Then we have
$$w = \varphi(v) = \sum_{j=1}^{n-k} t_j \varphi(v_j),$$
which means that $w$ is a linear combination in terms of the $\varphi(v_j)$. In order to prove that the family $\varphi(v_j)$, $j = 1, \ldots, n-k$, is linearly independent, let a representation of zero be given,
$$0 = \sum_{j=1}^{n-k} t_j \varphi(v_j).$$
Then
$$\varphi\left( \sum_{j=1}^{n-k} t_j v_j \right) = 0.$$
Therefore, $\sum_{j=1}^{n-k} t_j v_j$ belongs to the kernel of the mapping. Hence, we can write
$$\sum_{j=1}^{n-k} t_j v_j = \sum_{i=1}^{k} s_i u_i.$$
Since, altogether, the $u_i$ and $v_j$ form a basis of $V$, we can infer that all coefficients are $0$; in particular, $t_j = 0$ for all $j$.



Let $K$ denote a field, let $V$ and $W$ denote $K$-vector spaces, and let
$$\varphi \colon V \longrightarrow W$$
denote a $K$-linear mapping. Suppose that $V$ has finite dimension. Then we call
$$\operatorname{rk} \varphi := \dim(\operatorname{Im} \varphi)$$
the rank of $\varphi$.

The dimension formula can also be expressed as
$$\dim(V) = \dim(\ker \varphi) + \operatorname{rk} \varphi.$$

Let $\varphi \colon V \longrightarrow W$ be a linear mapping, where $V$ has finite dimension. The dimension formula can be illustrated with the following special cases. If $\varphi$ is the zero mapping, then $\ker \varphi = V$ and
$$\operatorname{rk} \varphi = 0.$$
If $\varphi$ is injective, then $\ker \varphi = 0$ and
$$\operatorname{rk} \varphi = \dim(V).$$
The rank is always between $0$ and the dimension of the source space $V$. If $\varphi$ is surjective, then
$$\operatorname{Im} \varphi = W$$
and
$$\operatorname{rk} \varphi = \dim(W).$$

We consider the linear mapping

given by the matrix

To determine the kernel, we have to solve the homogeneous linear system

The solution space is

and this is the kernel of . The kernel has dimension one, therefore the dimension of the image is , due to the dimension formula.
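The concrete matrix of this example was not preserved above; as a substitute, the following sketch (Python/sympy, with a stand-in matrix that also has a one-dimensional kernel) verifies the dimension formula numerically:

```python
from sympy import Matrix

# Stand-in 3x3 matrix (not the matrix of the example above) whose kernel is one-dimensional:
# the third row is the sum of the first two, so the rank is 2.
M = Matrix([[1, 2, 1],
            [0, 1, 1],
            [1, 3, 2]])

n = M.shape[1]                    # dimension of the source space K^n
dim_kernel = len(M.nullspace())   # dimension of the kernel
rank = M.rank()                   # dimension of the image

print(dim_kernel, rank)           # 1 2
assert n == dim_kernel + rank     # dim V = dim(ker) + rk, the dimension formula
```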


Let $K$ denote a field, and let $V$ and $W$ denote $K$-vector spaces with the same dimension $n$. Let
$$\varphi \colon V \longrightarrow W$$
denote a linear mapping. Then $\varphi$ is injective if and only if $\varphi$ is surjective.

This follows from the dimension formula and Lemma 11.4.
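As a small numerical check of this corollary (Python/sympy, with a sample square matrix chosen here), injectivity and surjectivity of a linear map from $K^3$ to itself indeed occur together:

```python
from sympy import Matrix

# Sample 3x3 matrix (chosen for illustration), describing a linear map K^3 -> K^3.
M = Matrix([[1, 2, 0],
            [0, 1, 1],
            [1, 0, 1]])

injective = (len(M.nullspace()) == 0)   # kernel = 0
surjective = (M.rank() == 3)            # the image is all of K^3

# For equal source and target dimension, the two properties occur together.
assert injective == surjective
print(injective, surjective)            # True True
```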



Composition of linear mappings and matrices

In the last lecture, we discussed the correspondence between linear mappings and matrices, given fixed bases. This correspondence also respects the composition of mappings and matrix multiplication, as the following lemma shows.


In the correspondence between linear mappings and matrices, the composition of linear mappings corresponds to the matrix multiplication. More precisely: let $U, V, W$ denote vector spaces over a field $K$ with bases
$$\mathfrak{u} = u_1, \ldots, u_p, \quad \mathfrak{v} = v_1, \ldots, v_n \quad \text{and} \quad \mathfrak{w} = w_1, \ldots, w_m.$$
Let
$$\psi \colon U \longrightarrow V \quad \text{and} \quad \varphi \colon V \longrightarrow W$$
denote linear mappings. Then, for the describing matrices of $\psi$, of $\varphi$, and of the composition $\varphi \circ \psi$, the relation
$$M^{\mathfrak{u}}_{\mathfrak{w}}(\varphi \circ \psi) = M^{\mathfrak{v}}_{\mathfrak{w}}(\varphi) \cdot M^{\mathfrak{u}}_{\mathfrak{v}}(\psi)$$
holds.

We consider the commutative diagram

where the commutativity rests on the identities

from Lemma 10.14. The (inverse) coordinate mappings are bijective. Therefore, we have

Hence, we get altogether

where we have everywhere compositions of mappings. Due to Exercise 10.20, the composition of mappings corresponds to the matrix multiplication.

This implies immediately that the multiplication of matrices is associative.
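The following sketch (Python/sympy, with sample matrices and the standard bases, so that the describing matrices are just the matrices themselves) illustrates that the matrix of a composition is the matrix product, and that matrix multiplication is associative:

```python
from sympy import Matrix

# With respect to the standard bases, linear maps are described directly by their matrices.
# Sample matrices (chosen for illustration): psi : K^3 -> K^2 and phi : K^2 -> K^4.
N = Matrix([[1, 0, 2],
            [0, 1, 1]])           # describes psi
M = Matrix([[1, 1],
            [0, 2],
            [3, 0],
            [1, 1]])              # describes phi

# The composition phi after psi sends x to M*(N*x), so its describing matrix is the product M*N.
x = Matrix([5, -1, 2])
assert M * (N * x) == (M * N) * x

# Associativity of matrix multiplication, as noted above.
P = Matrix([[1, 2],
            [0, 1],
            [1, 0]])              # a third sample matrix, K^2 -> K^3
assert (M * N) * P == M * (N * P)
```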



Linear mappings and base change

Let $K$ denote a field, and let $V$ and $W$ denote finite-dimensional $K$-vector spaces. Let $\mathfrak{v}$ and $\mathfrak{u}$ be bases of $V$, and $\mathfrak{w}$ and $\mathfrak{z}$ bases of $W$. Let
$$\varphi \colon V \longrightarrow W$$
denote a linear mapping, which is described by the matrix $M^{\mathfrak{v}}_{\mathfrak{w}}(\varphi)$ with respect to the bases $\mathfrak{v}$ and $\mathfrak{w}$. Then $\varphi$ is described with respect to the bases $\mathfrak{u}$ and $\mathfrak{z}$ by the matrix
$$M^{\mathfrak{u}}_{\mathfrak{z}}(\varphi) = B \cdot M^{\mathfrak{v}}_{\mathfrak{w}}(\varphi) \cdot A^{-1},$$
where $A$ and $B$ are the transformation matrices which describe the change of basis from $\mathfrak{v}$ to $\mathfrak{u}$ and from $\mathfrak{w}$ to $\mathfrak{z}$.

The linear standard mappings and for the various bases are denoted by . We consider the commutative diagram

where the commutativity rests on Lemma 9.1 and Lemma 10.14. In this situation, we have altogether
$$M^{\mathfrak{u}}_{\mathfrak{z}}(\varphi) = B \cdot M^{\mathfrak{v}}_{\mathfrak{w}}(\varphi) \cdot A^{-1}.$$


Let $K$ denote a field, and let $V$ denote a $K$-vector space of finite dimension. Let
$$\varphi \colon V \longrightarrow V$$
be a linear mapping. Let $\mathfrak{u}$ and $\mathfrak{v}$ denote bases of $V$. Then the matrices that describe the linear mapping $\varphi$ with respect to $\mathfrak{u}$ and $\mathfrak{v}$ respectively (on both sides) fulfil the relation
$$M^{\mathfrak{v}}_{\mathfrak{v}}(\varphi) = B \cdot M^{\mathfrak{u}}_{\mathfrak{u}}(\varphi) \cdot B^{-1},$$
where $B$ denotes the transformation matrix of the base change from $\mathfrak{u}$ to $\mathfrak{v}$.

This follows directly from Lemma 11.11.
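The following sketch (Python/sympy; the endomorphism and the new basis are sample data chosen here) illustrates this corollary: the describing matrix with respect to a new basis is obtained by conjugating with the transformation matrix of the base change.

```python
from sympy import Matrix

# Describing matrix of an endomorphism of K^2 with respect to the standard basis
# (sample data chosen for illustration, not from the lecture).
M_std = Matrix([[2, 1],
                [0, 3]])

# A new basis v1 = (1, 1), v2 = (1, 2); its vectors are the columns of T,
# so T converts coordinates with respect to the new basis into standard coordinates.
T = Matrix([[1, 1],
            [1, 2]])

# Describing matrix with respect to the new basis: conjugate with T.
M_new = T.inv() * M_std * T

# Consistency check on a sample vector: applying the map in new coordinates and
# converting back agrees with applying it directly in standard coordinates.
x_std = Matrix([3, 5])
x_new = T.inv() * x_std
assert T * (M_new * x_new) == M_std * x_std
```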


It is an important goal of linear algebra to find, for a given linear mapping $\varphi \colon V \longrightarrow V$, a basis such that the describing matrix becomes "quite simple".

