Linear algebra (Osnabrück 2024-2025)/Part I/Lecture 12
- Invertible matrices
Let $K$ be a field, and let $M$ denote an $n \times n$-matrix over $K$. Then $M$ is called invertible if there exists a matrix $A \in \operatorname{Mat}_n(K)$ such that
$$A \circ M = E_n = M \circ A.$$
Let $K$ denote a field. For an invertible matrix $M \in \operatorname{Mat}_n(K)$, the matrix $A \in \operatorname{Mat}_n(K)$ fulfilling
$$A \circ M = E_n = M \circ A$$
is called the inverse matrix of $M$. It is denoted by
$$M^{-1}.$$
The product of invertible matrices is again invertible.
For a field $K$ and $n \in \mathbb{N}_+$, the set of all invertible $n \times n$-matrices with entries in $K$ is called the general linear group over $K$. It is denoted by $\operatorname{GL}_n(K)$.

Two square matrices $M, N \in \operatorname{Mat}_n(K)$ are called similar if there exists an invertible matrix $B$ with
$$M = B N B^{-1}.$$
For a linear mapping $\varphi \colon V \rightarrow V$, the describing matrices with respect to two bases are similar to each other, due to Corollary 11.12.
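As a quick check of the definition (a sketch with plain Python lists and exact `Fraction` arithmetic; the example matrix and helper name are our own illustration, not from the lecture), multiplying a matrix with its inverse from either side yields the identity matrix:

```python
from fractions import Fraction

def mat_mul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][l] * B[l][j] for l in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# M has determinant 2*3 - 1*5 = 1, so its inverse is [[3, -1], [-5, 2]].
M = [[Fraction(2), Fraction(1)], [Fraction(5), Fraction(3)]]
A = [[Fraction(3), Fraction(-1)], [Fraction(-5), Fraction(2)]]

print(mat_mul(A, M))  # [[1, 0], [0, 1]], the identity matrix E_2
print(mat_mul(M, A))  # [[1, 0], [0, 1]] as well
```

Both products have to be checked in the definition; for square matrices, one of them in fact implies the other.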
- Properties of linear mappings
Let $K$ be a field, and let $V$ and $W$ be vector spaces over $K$ of dimensions $n$ and $m$. Let
$$\varphi \colon V \longrightarrow W$$
be a linear map, described by the matrix $M \in \operatorname{Mat}_{m \times n}(K)$ with respect to two bases. Then the following properties hold.
- $\varphi$ is injective if and only if the columns of the matrix are linearly independent.
- $\varphi$ is surjective if and only if the columns of the matrix form a generating system of $K^m$.
- Let $m = n$. Then $\varphi$ is bijective if and only if the columns of the matrix form a basis of $K^m$, and this holds if and only if $M$ is invertible.
Let $v_1, \ldots, v_n$ and $w_1, \ldots, w_m$ denote the bases of $V$ and $W$ respectively, and let $s_1, \ldots, s_n$ denote the column vectors of $M$. (1). The mapping $\varphi$ has the property
$$\varphi(v_j) = \sum_{i=1}^m s_{ij} w_i,$$
where $s_{ij}$ is the $i$-th entry of the $j$-th column vector $s_j$. Therefore,
$$\varphi\left(\sum_{j=1}^n c_j v_j\right) = \sum_{j=1}^n c_j \varphi(v_j) = \sum_{i=1}^m \left(\sum_{j=1}^n c_j s_{ij}\right) w_i.$$
This is $0$ if and only if $\sum_{j=1}^n c_j s_{ij} = 0$ for all $i$, and this is equivalent with
$$\sum_{j=1}^n c_j s_j = 0.$$
For this vector equation, there exists a nontrivial tuple $(c_1, \ldots, c_n) \neq (0, \ldots, 0)$ if and only if the columns are linearly dependent, and this holds if and only if $\varphi$ is not injective.
(2). See Exercise 12.5.
(3). Let $n = m$. The first equivalence follows from (1) and (2). If $\varphi$ is bijective, then there exists a
(linear)
inverse mapping $\varphi^{-1}$ with
$$\varphi \circ \varphi^{-1} = \operatorname{Id}_W \quad \text{and} \quad \varphi^{-1} \circ \varphi = \operatorname{Id}_V.$$
Let $M$ denote the matrix for $\varphi$, and $N$ the matrix for $\varphi^{-1}$. The matrix for the identity is the identity matrix. Because of Lemma 11.10, we have
$$M \circ N = E_n = N \circ M,$$
and therefore $M$ is invertible. The reverse implication is proved similarly.
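The core of the argument for (1) is that applying $\varphi$ to $\sum_j c_j v_j$ amounts, in coordinates, to forming the linear combination $\sum_j c_j s_j$ of the columns. A small sketch (the example matrix and the helper name are our own illustration) checks that a dependence relation among the columns produces a nonzero vector in the kernel:

```python
from fractions import Fraction

def apply_matrix(M, c):
    """Compute M*c, i.e. the linear combination sum_j c_j * (j-th column of M)."""
    return [sum(M[i][j] * c[j] for j in range(len(c))) for i in range(len(M))]

# The third column equals the sum of the first two, so c = (1, 1, -1)
# is a nontrivial dependence relation: the described map is not injective.
M = [[Fraction(x) for x in row] for row in [[1, 0, 1], [2, 1, 3], [0, 4, 4]]]
c = [Fraction(1), Fraction(1), Fraction(-1)]

print(apply_matrix(M, c))  # [0, 0, 0], a nontrivial kernel vector
```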
- Elementary matrices
Let $K$ be a field, and let $M$ be an $m \times n$-matrix over $K$. Then the following manipulations on $M$ are called elementary row operations.
- Transposition of two rows.
- Multiplication of a row with a scalar $s \neq 0$.
- Addition of $s$ times a row to another row.
Let $K$ be a field. We denote by $B_{ij}$ the $n \times n$-matrix with entry $1$ at the position $(i,j)$, and entry $0$ everywhere else. Then the following matrices are called elementary matrices.
- $V_{ij} := E_n - B_{ii} - B_{jj} + B_{ij} + B_{ji}$.
- $S_k(s) := E_n + (s-1) B_{kk}$ for $s \neq 0$.
- $A_{ij}(a) := E_n + a B_{ij}$ for $i \neq j$.
In detail, these elementary matrices look as follows: $V_{ij}$ arises from the identity matrix by exchanging its $i$-th and $j$-th rows; $S_k(s)$ arises by replacing the $k$-th diagonal entry $1$ by $s$; and $A_{ij}(a)$ arises by putting the entry $a$ at the position $(i,j)$.
Elementary matrices are invertible; see Exercise 12.1.
Let $K$ be a field and $M$ an $m \times n$-matrix with entries in $K$. Then multiplication by the elementary matrices from the left has the following effects on $M$.
- $V_{ij} \circ M$: exchange of the $i$-th and the $j$-th row of $M$.
- $S_k(s) \circ M$: multiplication of the $k$-th row of $M$ by $s$.
- $A_{ij}(a) \circ M$: addition of $a$-times the $j$-th row of $M$ to the $i$-th row ($i \neq j$).
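These effects can be checked directly; the following small Python sketch (plain lists of `Fraction` entries, helper names and the sample matrix our own) multiplies $V_{12}$ and $A_{12}(5)$ from the left with a $2 \times 2$ matrix:

```python
from fractions import Fraction

def mat_mul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][l] * B[l][j] for l in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

M = [[Fraction(1), Fraction(2)], [Fraction(3), Fraction(4)]]

# V_12: the identity matrix with the first and the second row exchanged.
V12 = [[Fraction(0), Fraction(1)], [Fraction(1), Fraction(0)]]
print(mat_mul(V12, M))  # rows of M exchanged: [[3, 4], [1, 2]]

# A_12(5) = E_2 + 5*B_12: adds 5 times the second row to the first row.
A12 = [[Fraction(1), Fraction(5)], [Fraction(0), Fraction(1)]]
print(mat_mul(A12, M))  # [[1 + 5*3, 2 + 5*4], [3, 4]] = [[16, 22], [3, 4]]
```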
Proof
Elementary row operations do not change the solution space of a homogeneous linear system, as shown in Lemma 5.3.
Let $K$ be a field, and let $M$ denote an $m \times n$-matrix over $K$. Then there exist elementary row operations and a (new) numbering of the columns
$$j_1, j_2, \ldots, j_n,$$
and an $r \leq m$ such that, in the new matrix, the columns $s_{j_1}, \ldots, s_{j_r}$ have the form
$$s_{j_k} = \begin{pmatrix} b_{1 j_k} \\ \vdots \\ b_{k j_k} \\ 0 \\ \vdots \\ 0 \end{pmatrix} \quad \text{with } b_{k j_k} \neq 0,$$
and the columns $s_{j_{r+1}}, \ldots, s_{j_n}$ have the form
$$s_{j_k} = \begin{pmatrix} d_{1 j_k} \\ \vdots \\ d_{r j_k} \\ 0 \\ \vdots \\ 0 \end{pmatrix}.$$
By further elementary row operations, and by swapping of columns, the matrix can be brought to the form
$$\begin{pmatrix} d_{11} & * & \cdots & * & * & \cdots & * \\ 0 & d_{22} & \ddots & \vdots & \vdots & & \vdots \\ \vdots & \ddots & \ddots & * & \vdots & & \vdots \\ 0 & \cdots & 0 & d_{rr} & * & \cdots & * \\ 0 & \cdots & \cdots & 0 & 0 & \cdots & 0 \\ \vdots & & & \vdots & \vdots & & \vdots \\ 0 & \cdots & \cdots & 0 & 0 & \cdots & 0 \end{pmatrix}$$
with $d_{ii} \neq 0$.
Let $K$ be a field, and let $M$ denote an invertible $n \times n$-matrix over $K$. Then there exist elementary row operations such that, after these manipulations, a matrix of the form
$$\begin{pmatrix} d_1 & * & \cdots & * \\ 0 & d_2 & \ddots & \vdots \\ \vdots & \ddots & \ddots & * \\ 0 & \cdots & 0 & d_n \end{pmatrix}$$
with $d_i \neq 0$ arises. By further elementary row operations, one can also obtain the identity matrix.

This rests on the manipulations of the elimination procedure, and on the fact that elementary row manipulations are achieved, due to Lemma 12.18, by multiplications with elementary matrices from the left. In doing this, it cannot happen that a zero column or a zero row arises, because the elementary matrices are invertible and, in each step, invertibility is preserved. If we have an upper triangular matrix, then the diagonal entries are not $0$, and, by multiplication with a scalar, we can normalize them to $1$. With this, we can further achieve, in every column, that all entries above the diagonal entry are $0$.
In particular, for an invertible matrix $M$, there are elementary matrices $E_1, \ldots, E_k$ such that
$$E_k \circ \cdots \circ E_1 \circ M$$
is the identity matrix.
- Finding the inverse matrix
Let $M$ denote a square matrix. How can we decide whether the matrix is invertible, and how can we find the inverse matrix $M^{-1}$?
For this we write down a table; on the left-hand side we write down the matrix $M$, and on the right-hand side we write down the identity matrix (of the right size). Now we apply, on both sides, step by step the same elementary row manipulations. The goal is to produce in the left-hand column, starting with the matrix $M$, in the end the identity matrix. This is possible if and only if the matrix is invertible. We claim that we produce, by this method, in the right-hand column the matrix $M^{-1}$ in the end. This rests on the following invariance principle. Every elementary row manipulation can be realized as a matrix multiplication with some elementary matrix $E$ from the left. If in the table we have somewhere the pair
$$(M_1, M_2),$$
after the next step (in the next line) we have
$$(E M_1, E M_2).$$
If we multiply the inverse of the second matrix (which we do not know yet; however, we do know its existence, in case $M$ is invertible) with the first matrix, then we get
$$(E M_2)^{-1} (E M_1) = M_2^{-1} E^{-1} E M_1 = M_2^{-1} M_1.$$
This means that this expression is not changed in each single step. In the beginning, this expression equals $E_n^{-1} M = M$; hence, if in the end the pair $(E_n, N)$ arises, it must fulfil $N^{-1} E_n = M$, that is, $N = M^{-1}$.
As an example, we want to find, for a given square matrix $M$, its inverse matrix $M^{-1}$, following Method 12.11.
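The tabular procedure of Method 12.11 can be sketched in Python, using exact arithmetic via `Fraction` (the function name `invert` and the test matrix are our own illustration). The same row operations are applied to the left-hand matrix $A$ and to the right-hand matrix $B$; when $A$ has become the identity matrix, $B$ is the inverse:

```python
from fractions import Fraction

def invert(M):
    """Gauss-Jordan inversion: returns M^{-1}, or None if M is not invertible."""
    n = len(M)
    # The table (A | B): start with A = M and B = E_n.
    A = [[Fraction(x) for x in row] for row in M]
    B = [[Fraction(1 if i == j else 0) for j in range(n)] for i in range(n)]
    for col in range(n):
        # Find a row with a nonzero pivot and swap it up (row transposition).
        pivot = next((r for r in range(col, n) if A[r][col] != 0), None)
        if pivot is None:
            return None  # a zero column arises: M is not invertible
        A[col], A[pivot] = A[pivot], A[col]
        B[col], B[pivot] = B[pivot], B[col]
        # Scale the pivot row so that the pivot entry becomes 1.
        s = A[col][col]
        A[col] = [x / s for x in A[col]]
        B[col] = [x / s for x in B[col]]
        # Clear the rest of the column by adding multiples of the pivot row.
        for r in range(n):
            if r != col and A[r][col] != 0:
                f = A[r][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[col])]
                B[r] = [x - f * y for x, y in zip(B[r], B[col])]
    return B

print(invert([[2, 1], [5, 3]]))  # [[3, -1], [-5, 2]]
print(invert([[1, 2], [2, 4]]))  # None: the rows are linearly dependent
```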
- Rank of matrices
Let $K$ be a field, and let $M$ denote an $m \times n$-matrix over $K$. Then the dimension of the linear subspace of $K^m$ generated by the columns is called the column rank of the matrix, written
$$\operatorname{rk} M.$$
Let $K$ denote a field, and let $V$ and $W$ denote $K$-vector spaces of dimensions $n$ and $m$. Let
$$\varphi \colon V \longrightarrow W$$
be a linear mapping, which is described by the matrix $M \in \operatorname{Mat}_{m \times n}(K)$ with respect to bases of the spaces. Then
$$\operatorname{rk} \varphi = \operatorname{rk} M.$$
Proof
To formulate the next statement, we introduce the row rank of an $m \times n$-matrix to be the dimension of the linear subspace of $K^n$ generated by the rows.
Let $K$ be a field, and let $M$ denote an $m \times n$-matrix over $K$. Then the column rank coincides with the row rank. The rank equals the number $r$ from Theorem 12.9.

In an elementary row manipulation, the linear subspace generated by the rows is not changed; therefore, the row rank is not changed. The row rank of $M$ equals the row rank of the matrix in echelon form obtained in Theorem 12.9. This matrix has row rank $r$, since the first $r$ rows are linearly independent and, apart from this, there are only zero rows. It has also column rank $r$, since the $r$ columns where there is a new step are linearly independent, and the other columns are linear combinations of these columns. By Exercise 12.18, the column rank is preserved by elementary row manipulations.
Both ranks coincide, so we only talk about the rank of a matrix.
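The equality of the two ranks can also be checked computationally; the following sketch (the helper name `rank` and the example matrix are our own) computes the row rank by Gaussian elimination and compares it with the row rank of the transpose, which is the column rank of $M$:

```python
from fractions import Fraction

def rank(M):
    """Row rank via Gaussian elimination: number of nonzero rows in echelon form."""
    A = [[Fraction(x) for x in row] for row in M]
    r = 0
    for col in range(len(A[0])):
        # Look for a pivot in this column among the not-yet-used rows.
        pivot = next((i for i in range(r, len(A)) if A[i][col] != 0), None)
        if pivot is None:
            continue
        A[r], A[pivot] = A[pivot], A[r]
        # Eliminate the column entries below the pivot row.
        for i in range(r + 1, len(A)):
            f = A[i][col] / A[r][col]
            A[i] = [x - f * y for x, y in zip(A[i], A[r])]
        r += 1
    return r

M = [[1, 2, 3], [2, 4, 6], [1, 0, 1]]  # the second row is twice the first
transpose = [list(col) for col in zip(*M)]
print(rank(M), rank(transpose))  # both equal 2
```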
Let $K$ be a field, and let $M$ denote an $n \times n$-matrix over $K$. Then the following statements are equivalent.
- $M$ is invertible.
- The rank of $M$ is $n$.
- The rows of $M$ are linearly independent.
- The columns of $M$ are linearly independent.