Mathematics for Applied Sciences (Osnabrück 2023-2024)/Part I/Lecture 26/latex
\setcounter{section}{26}
\subtitle {Rank of matrices}
\inputdefinition
{ }
{
Let $K$ be a field, and let $M$ denote an $m \times n$-matrix over $K$. Then the dimension of the linear subspace of $K^m$, generated by the columns, is called the \definitionword {column rank}{} of the matrix, written
\mathdisp {\operatorname{rk} \, M} { . }
}
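For illustration: the columns of the \mathl{2 \times 3}{-}matrix
\mathdisp {\begin{pmatrix} 1 & 2 & 3 \\ 2 & 4 & 6 \end{pmatrix}} { }
are all scalar multiples of the first column. Hence, they generate a one-dimensional linear subspace of $K^2$, and the column rank of this matrix is $1$.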
\inputfactproof
{Linear mapping/Matrix and basis/Rank/Fact}
{Lemma}
{}
{
\factsituation {Let $K$ denote a
field,
and let
\mathcor {} {V} {and} {W} {}
denote
$K$-vector spaces
of dimensions
\mathcor {} {n} {and} {m} {.}
Let
\mathdisp {\varphi \colon V \longrightarrow W} { }
be a
linear mapping,
which is described by the
matrix
\mathrelationchain
{\relationchain
{ M
}
{ \in }{ \operatorname{Mat}_{ m \times n } (K)
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
with respect to
bases
of the spaces.}
\factconclusion {Then
\mathrelationchaindisplay
{\relationchain
{ \operatorname{rk} \, \varphi
}
{ =} { \operatorname{rk} \, M
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
holds.}
\factextra {}
{See Exercise 26.22 .}
To formulate the next statement, we introduce the \keyword {row rank} {} of an \mathl{m \times n}{-}matrix to be the dimension of the linear subspace of $K^n$ generated by the rows.
\inputfactproof
{Matrix/Row rank and column rank/Manipulations/Fact}
{Lemma}
{}
{
\factsituation {Let $K$ be a
field,
and let $M$ denote an
$m \times n$-matrix
over $K$.}
\factconclusion {Then the
column rank
coincides with the
row rank.}
\factextra {If $M$ is transformed with elementary row manipulations to a matrix $M'$ in the sense of
Theorem 21.9
,
then the rank equals the number of relevant rows of $M'$.}
}
{
Let $r$ denote the number of relevant rows of the matrix $M'$ in echelon form, obtained by elementary row manipulations. We have to show that this number equals the column rank and the row rank, both of $M'$ and of $M$. An elementary row manipulation does not change the linear subspace generated by the rows; therefore, the row rank is not changed. So the row rank of $M$ equals the row rank of $M'$. This matrix has row rank $r$, since the first $r$ rows are linearly independent and, apart from these, there are only zero rows. But $M'$ has also column rank $r$, since the $r$ columns in which a new step starts are linearly independent, and the other columns are linear combinations of these $r$ columns. By Exercise 26.2 , the column rank is preserved by elementary row manipulations.
}
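For example, for the matrix
\mathdisp {M = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 4 & 7 \\ 1 & 2 & 4 \end{pmatrix}} { , }
subtracting twice the first row from the second row, the first row from the third row, and then the second row from the third row yields the echelon form
\mathdisp {M' = \begin{pmatrix} 1 & 2 & 3 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix}} { . }
There are two relevant rows; hence, the rank of $M$ is $2$. Indeed, the second column of $M$ is twice the first column, and the first and the third column are linearly independent.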
Both ranks coincide, so we only talk about the \keyword {rank of a matrix} {.}
\inputfactproof
{Quadratic matrix/Rank/Invertible/Linearly independent/Fact}
{Corollary}
{}
{
\factsituation {Let $K$ be a
field,
and let $M$ denote an
$n \times n$-matrix
over $K$.}
\factsegue {Then the following statements are equivalent.}
\factconclusion {\enumerationfour {$M$ is invertible.
} {The
rank
of $M$ is $n$.
} {The rows of $M$ are
linearly independent.
} {The columns of $M$ are linearly independent.
}}
\factextra {}
}
{
The equivalence of (2), (3) and (4) follows from the definition and from
Lemma 26.3
.
For the equivalence of (1) and (2), consider the
linear mapping
\mathdisp {\varphi \colon K^n \longrightarrow K^n} { }
defined by $M$. The property that the column rank equals $n$ is equivalent to the map being surjective, and this is, due to
Corollary 25.4
,
equivalent to the map being bijective. Because of
Lemma 25.11
,
bijectivity is equivalent to the matrix being
invertible.
}
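For instance, the matrix \mathl{\begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}}{} has rank $1$, as its second row is twice its first row; accordingly, it is not invertible. The matrix \mathl{\begin{pmatrix} 1 & 2 \\ 0 & 1 \end{pmatrix}}{} has linearly independent rows, hence rank $2$, and it is invertible, with inverse \mathl{\begin{pmatrix} 1 & -2 \\ 0 & 1 \end{pmatrix}}{.}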
\subtitle {Determinants}
\inputdefinition
{ }
{
Let $K$ be a
field,
and let $M$ denote an
$n \times n$-matrix
over $K$ with entries $a_{ij}$. For
\mathrelationchain
{\relationchain
{i
}
{ \in }{ \{ 1 , \ldots , n \}
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
let $M_i$ denote the \mathl{(n-1)\times (n-1)}{-}matrix, which arises from $M$, when we remove the first column and the $i$-th row. Then one defines recursively the \definitionword {determinant}{} of $M$ by
\mathrelationchaindisplay
{\relationchain
{ \det M
}
{ =} { \begin{cases} a_{11}\, , & \text{ for } n = 1 \, , \\ \sum_{i =1}^n(-1)^{i+1} a_{i1} \det M_i & \text{ for } n \geq 2 \, . \end{cases}
}
{ } {
}
{ } {
}
{ } {
}
}
}
The determinant is only defined for square matrices. For small $n$, the determinant can be computed easily.
\inputexample{}
{
For a
$2\times 2$-matrix
\mathrelationchaindisplay
{\relationchain
{M
}
{ =} { \begin{pmatrix} a & b \\ c & d \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{,}
we have
\mathdisp {\det \begin{pmatrix} a & b \\ c & d \end{pmatrix} = a d - c b} { . }
}
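For instance,
\mathdisp {\det \begin{pmatrix} 2 & 3 \\ 1 & 4 \end{pmatrix} = 2 \cdot 4 - 1 \cdot 3 = 5} { . }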
\image{ \begin{center}
\includegraphics[width=5.5cm]{\imageinclude {Sarrus_rule.png} }
\end{center}
\imagetext {For a $3\times 3$-matrix, we can use the \keyword {rule of Sarrus} {} to compute the determinant. We repeat the first column as the fourth column and the second column as the fifth column. The products along the diagonals running from upper left to lower right enter with a positive sign, and the products along the other diagonals enter with a negative sign.} }
\imagelicense { Sarrus rule.png } {} {Kmhkmh} {Commons} {CC-by-sa 3.0} {}
\inputexample{}
{
For a
$3 \times 3$-matrix
\mathrelationchain
{\relationchain
{M
}
{ = }{ \begin{pmatrix} a_{ 1 1 } & a_{ 1 2 } & a_{ 1 3 } \\ a_{ 2 1 } & a_{ 2 2 } & a_{ 2 3 } \\ a_{ 3 1 } & a_{ 32 } & a_{ 33 } \end{pmatrix}
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
we have
\mathdisp {\det \begin{pmatrix} a_{ 1 1 } & a_{ 1 2 } & a_{ 1 3 } \\ a_{ 2 1 } & a_{ 2 2 } & a_{ 2 3 } \\ a_{ 3 1 } & a_{ 32 } & a_{ 33 } \end{pmatrix} = a_{1 1 } a_{2 2 } a_{3 3 } + a_{1 2 } a_{2 3 } a_{3 1 } + a_{1 3 } a_{2 1 } a_{3 2 } - a_{1 3 } a_{2 2 } a_{3 1 } - a_{1 1 } a_{2 3 } a_{3 2 } - a_{1 2 } a_{2 1 } a_{3 3 }} { . }
This is called the \keyword {rule of Sarrus} {.}
}
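For instance, for the matrix
\mathdisp {M = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 10 \end{pmatrix}} { , }
the rule of Sarrus yields
\mathdisp {\det M = 1 \cdot 5 \cdot 10 + 2 \cdot 6 \cdot 7 + 3 \cdot 4 \cdot 8 - 3 \cdot 5 \cdot 7 - 1 \cdot 6 \cdot 8 - 2 \cdot 4 \cdot 10 = 50+84+96-105-48-80 = -3} { . }
The recursive definition gives the same value, namely \mathl{1 \cdot (50-48) - 4 \cdot (20-24) + 7 \cdot (12-15) = 2+16-21 = -3}{.}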
\inputfactproof
{Determinant/Field/Upper triangular matrix/Fact}
{Lemma}
{}
{
\factsituation {}
\factcondition {For an
upper triangular matrix
\mathrelationchaindisplay
{\relationchain
{ M
}
{ =} { \begin{pmatrix} b_1 & \ast & \cdots & \cdots & \ast \\ 0 & b_2 & \ast & \cdots & \ast \\ \vdots & \ddots & \ddots & \ddots & \vdots \\ 0 & \cdots & 0 & b_{ n-1} & \ast \\ 0 & \cdots & \cdots & 0 & b_{ n } \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{,}}
\factconclusion {we have
\mathrelationchaindisplay
{\relationchain
{ \det M
}
{ =} { b_1 b_2 \cdots b_{n-1} b_n
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}}
\factextra {In particular, for the
identity matrix
we get
\mathrelationchain
{\relationchain
{ \det E_{ n }
}
{ = }{ 1
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}}
}
{
This follows by a simple induction directly from the recursive definition of the determinant: expanding along the first column, only the entry $b_1$ contributes, and the corresponding submatrix $M_1$ is again an upper triangular matrix, with diagonal entries \mathl{b_2 , \ldots , b_n}{.}
}
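For instance,
\mathdisp {\det \begin{pmatrix} 2 & 5 & 7 \\ 0 & 3 & 1 \\ 0 & 0 & 4 \end{pmatrix} = 2 \cdot 3 \cdot 4 = 24} { . }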
\subtitle {Multilinearity}
We want to show that the recursively defined determinant is a \quotationshort{multilinear}{} and \quotationshort{alternating}{} mapping, where we identify
\mathrelationchaindisplay
{\relationchain
{ \operatorname{Mat}_{ n } (K)
}
{ \cong} { (K^n)^n
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{,}
so a matrix is identified with the $n$-tuple of its rows. Accordingly, we write a matrix as a column of rows
\mathdisp {\begin{pmatrix} v_{1 } \\ \vdots\\ v_{ n } \end{pmatrix}} { }
where the entries $v_i$ are row vectors of length $n$.
\inputfaktbeweisnichtvorgefuehrt
{Determinant/Recursively/Multilinear/Fact}
{Theorem}
{}
{
\factsituation {Let $K$ be a field, and
\mathrelationchain
{\relationchain
{n
}
{ \in }{\N_+
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}}
\factconclusion {Then the
determinant
\mathdisp {\operatorname{Mat}_{ n } (K) = (K^n)^n \longrightarrow K
, M \longmapsto \det M} { , }
is
multilinear.}
\factextra {This means that for every
\mathrelationchain
{\relationchain
{k
}
{ \in }{\{ 1 , \ldots , n \}
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
and for every choice of \mathl{n-1}{} vectors
\mathrelationchain
{\relationchain
{ v_1 , \ldots , v_{k-1} , v_{k+1} , \ldots , v_n
}
{ \in }{ K^n
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
and for any
\mathrelationchain
{\relationchain
{u,w
}
{ \in }{ K^n
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
the identity
\mathrelationchaindisplay
{\relationchain
{ \det \begin{pmatrix} v_1 \\\vdots\\ v _{ k -1 }\\u+w\\ v_{ k +1 }\\ \vdots\\ v_{ n } \end{pmatrix}
}
{ =} { \det \begin{pmatrix} v_1 \\\vdots\\ v _{ k -1 }\\u\\ v_{ k +1 }\\ \vdots\\ v_{ n } \end{pmatrix} + \det \begin{pmatrix} v_1 \\\vdots\\ v _{ k -1 }\\w\\ v_{ k +1 }\\ \vdots\\ v_{ n } \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
holds, and for
\mathrelationchain
{\relationchain
{ s
}
{ \in }{ K
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
the identity
\mathrelationchaindisplay
{\relationchain
{ \det \begin{pmatrix} v_1 \\\vdots\\ v _{ k -1 }\\s u\\ v_{ k +1 }\\ \vdots\\ v_{ n } \end{pmatrix}
}
{ =} { s \det \begin{pmatrix} v_1 \\\vdots\\ v _{ k -1 }\\u\\ v_{ k +1 }\\ \vdots\\ v_{ n } \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
holds.}
}
{
Let
\mathdisp {M \defeq \begin{pmatrix} v_1 \\\vdots\\ v _{ k -1 }\\u\\ v_{ k +1 }\\ \vdots\\ v_{ n } \end{pmatrix} \, ,M' \defeq \begin{pmatrix} v_1 \\\vdots\\ v _{ k -1 }\\w\\ v_{ k +1 }\\ \vdots\\ v_{ n } \end{pmatrix} \text{ and } \tilde{M} \defeq \begin{pmatrix} v_1 \\\vdots\\ v _{ k -1 }\\u+w\\ v_{ k +1 }\\ \vdots\\ v_{ n } \end{pmatrix}} { , }
where we denote the entries and the matrices arising from deleting a row in an analogous way. In particular,
\mathrelationchain
{\relationchain
{ u
}
{ = }{ \left( a_{k1} , \, \ldots , \, a_{kn} \right)
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
and
\mathrelationchain
{\relationchain
{ w
}
{ = }{ \left( a_{k1}' , \, \ldots , \, a_{kn}' \right)
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.} We prove the statement by induction on $n$. For
\mathrelationchain
{\relationchain
{i
}
{ \neq }{k
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
we have
\mathrelationchain
{\relationchain
{ \tilde{a}_{i 1}
}
{ = }{ a_{i1}
}
{ = }{ a'_{i1}
}
{ }{
}
{ }{
}
}
{}{}{}
and
\mathrelationchaindisplay
{\relationchain
{ \det \tilde{M}_i
}
{ =} { \det M_i + \det M'_i
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
due to the induction hypothesis. For
\mathrelationchain
{\relationchain
{i
}
{ = }{k
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
we have
\mathrelationchain
{\relationchain
{ M_k
}
{ = }{ M_k'
}
{ = }{ \tilde{M}_k
}
{ }{
}
{ }{
}
}
{}{}{}
and
\mathrelationchain
{\relationchain
{ \tilde{a}_{k 1}
}
{ = }{ a_{k1} + a'_{k1}
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
Altogether, we get
\mathrelationchainalignhandleft
{\relationchainalignhandleft
{ \det \tilde{M}
}
{ =} { \sum_{i = 1}^n (-1)^{i+1} \tilde{a}_{i1} \det \tilde{M}_i
}
{ =} { \sum_{i = 1,\, i \neq k }^n (-1)^{i+1} a_{i1} ( \det {M}_i + \det {M}'_i )
+ (-1)^{k+1} ( a_{k1} + a'_{k1} )( \det \tilde{M}_k )
}
{ =} { \sum_{i = 1,\, i \neq k }^n (-1)^{i+1} a_{i1} \det {M}_i + \sum_{i = 1,\, i \neq k }^n (-1)^{i+1} a_{i1} \det {M}'_i + (-1)^{k+1} a_{k1} \det M_k + (-1)^{k+1} a'_{k1} \det M_k
}
{ =} { \sum_{i = 1 }^n (-1)^{i+1} a_{i1} \det {M}_i + \sum_{ i = 1,\, i \neq k }^n (-1)^{i+1} a_{i1} \det {M}'_i
+ (-1)^{k+1} a'_{k1} \det M_k
}
}
{
\relationchainextensionalign
{ =} { \sum_{i = 1 }^n (-1)^{i+1} a_{i1} \det {M}_i + \sum_{ i = 1 }^n (-1)^{i+1} a'_{i1} \det {M}'_i
}
{ =} { \det M + \det M'
}
{ } {}
{ } {}
}
{}{.}
The compatibility with the scalar multiplication is proved in a similar way, see
exercise *****.
}
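As a small numerical check of the additivity in a row, we have
\mathdisp {\det \begin{pmatrix} 1+3 & 2+4 \\ 5 & 6 \end{pmatrix} = \det \begin{pmatrix} 4 & 6 \\ 5 & 6 \end{pmatrix} = -6 = (6-10) + (18-20) = \det \begin{pmatrix} 1 & 2 \\ 5 & 6 \end{pmatrix} + \det \begin{pmatrix} 3 & 4 \\ 5 & 6 \end{pmatrix}} { . }
Note that the determinant is not additive as a function of the matrix as a whole; the additivity holds only in each row separately.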
\inputfaktbeweisnichtvorgefuehrt
{Determinant/Recursively/Alternating/Swap property/Fact}
{Theorem}
{}
{
\factsituation {Let $K$ be a
field,
and
\mathrelationchain
{\relationchain
{n
}
{ \in }{\N_+
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}}
\factsegue {Then the
determinant
\mathdisp {\operatorname{Mat}_{ n } (K) = (K^n)^n \longrightarrow K
, M \longmapsto \det M} { , }
has the following properties.}
\factconclusion {\enumerationtwo {If in $M$ two rows are identical, then
\mathrelationchain
{\relationchain
{ \det M
}
{ = }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
This means that the determinant is
alternating.
} {If we exchange two rows in $M$, then the determinant changes with factor $-1$.
}}
\factextra {}
}
{Determinant/Recursively/Alternating/Swap property/Fact/Proof}
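For instance, we have \mathl{\det \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = -2}{,} and, after swapping the rows, \mathl{\det \begin{pmatrix} 3 & 4 \\ 1 & 2 \end{pmatrix} = 6-4 = 2}{.} For identical rows, we get \mathl{\det \begin{pmatrix} 1 & 2 \\ 1 & 2 \end{pmatrix} = 2-2 = 0}{.}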
\inputfactproof
{Determinant/Zero, linear dependent and rank property/Fact}
{Theorem}
{}
{
\factsituation {Let $K$ be a
field,
and let $M$ denote an
$n \times n$-matrix
over $K$.}
\factsegue {Then the following statements are equivalent.}
\factconclusion {\enumerationfour {We have
\mathrelationchain
{\relationchain
{ \det M
}
{ \neq }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
} {The rows of $M$ are
linearly independent.
} {$M$ is
invertible.
} {We have
\mathrelationchain
{\relationchain
{ \operatorname{rk} \, M
}
{ = }{ n
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
}}
\factextra {}
}
{
The relation between rank, invertibility and linear independence was proven in
Corollary 26.4
.
Suppose now that the rows are
linearly dependent.
After exchanging rows, we may assume that
\mathrelationchain
{\relationchain
{ v_n
}
{ = }{\sum_{i = 1}^{n-1} s_i v_i
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
Then, due to
Theorem 26.9
and
Theorem 26.10
,
we get
\mathrelationchaindisplay
{\relationchain
{ \det M
}
{ =} { \det \begin{pmatrix} v_1 \\ \vdots \\ v_{n-1}\\ \sum_{i = 1}^{n-1} s_i v_i \end{pmatrix}
}
{ =} { \sum_{i = 1}^{n-1} s_i \det \begin{pmatrix} v_1 \\ \vdots \\ v_{n-1}\\ v_i \end{pmatrix}
}
{ =} { 0
}
{ } {
}
}
{}{}{.}
Now suppose that the rows are linearly independent. Then, by exchanging rows, scaling, and adding one row to another row, we can transform the matrix successively into the identity matrix. During these manipulations, the determinant is multiplied by some factor $\neq 0$. Since the determinant of the identity matrix is $1$, the determinant of the initial matrix is also $\neq 0$.
}
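For instance, in the matrix
\mathdisp {\begin{pmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \\ 1 & 1 & 2 \end{pmatrix}} { , }
the third row is the sum of the first two rows, and, indeed, the determinant is \mathl{1 \cdot (2-1) - 0 + 1 \cdot (0-1) = 0}{.}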
\inputremark {}
{
\image{ \begin{center}
\includegraphics[width=5.5cm]{\imageinclude {Determinant_parallelepiped.svg} }
\end{center}
\imagetext {} }
\imagelicense { Determinant parallelepiped.svg } {} {Claudio Rocchini} {Commons} {CC-by-sa 3.0} {}
In case
\mathrelationchain
{\relationchain
{K
}
{ = }{\R
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
the
determinant
is in tight relation to volumes of geometric objects. If we consider in $\R^n$ vectors \mathl{v_1 , \ldots , v_n}{,} then they span a \keyword {parallelotope} {.} This is defined by
\mathrelationchaindisplay
{\relationchain
{P
}
{ \defeq} { { \left\{ s_1v_1 + \cdots + s_n v_n \mid s_i \in [0,1] \right\} }
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
It consists of all
linear combinations
of these vectors, where all the scalars belong to the unit interval. If the vectors are linearly independent, then this is a \quotationshort{voluminous}{} body, otherwise it is an object of smaller dimension. Now the relation
\mathrelationchaindisplay
{\relationchain
{ \operatorname{vol} \, P
}
{ =} { \betrag { \det { \left( v_1 , \ldots , v_n \right) } }
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
holds, saying that the volume of the parallelotope is the modulus of the determinant of the matrix, consisting of the spanning vectors as columns.
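For instance, the parallelogram in $\R^2$ spanned by the vectors \mathcor {} {(2,0)} {and} {(1,3)} {} has area
\mathdisp {\betrag { \det \begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix} } = \betrag { 6-0 } = 6} { , }
in accordance with the elementary formula base times height.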
}
\subtitle {The multiplication theorem for determinants}
We discuss, without proofs, further important theorems about the determinant. The proofs rely on a systematic account of the properties that characterize the determinant, namely multilinearity and the alternating property. Together with the condition that the determinant of the identity matrix is $1$, these properties determine the determinant uniquely.
\inputfaktbeweisnichtvorgefuehrt
{Determinant/Multiplication theorem/Fact}
{Theorem}
{}
{
\factsituation {Let $K$ denote a field, and
\mathrelationchain
{\relationchain
{n
}
{ \in }{\N_+
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}}
\factconclusion {Then for matrices
\mathrelationchain
{\relationchain
{A,B
}
{ \in }{\operatorname{Mat}_{ n } (K)
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
the relation
\mathrelationchaindisplay
{\relationchain
{ \det { \left( A \circ B \right) }
}
{ =} { \det A \cdot \det B
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
holds.}
\factextra {}
}
{
We fix the matrix $B$.
Suppose first that
\mathrelationchain
{\relationchain
{ \det B
}
{ = }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
Then, due to
Theorem 26.11
,
the matrix $B$ is not
invertible,
and therefore \mathl{A \circ B}{} is not invertible either. Hence,
\mathrelationchain
{\relationchain
{ \det { \left( A \circ B \right) }
}
{ = }{ 0
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
Suppose now that $B$ is invertible. In this case, we consider the well-defined mapping
\mathdisp {\delta \colon \operatorname{Mat}_{ n } (K) \longrightarrow K
, A \longmapsto ( \det { \left( A \circ B \right) } ) ( \det B )^{-1}} { . }
We want to show that this mapping equals the mapping \mathl{A \mapsto \det A}{,} by showing that it fulfills all the properties which, according to
Fact *****,
characterize the determinant. If \mathl{z_1 , \ldots , z_n}{} denote the rows of $A$, then \mathl{\delta(A)}{} is computed by applying the determinant to the rows \mathl{z_1 B , \ldots , z_n B}{,} and then multiplying by \mathl{(\det B )^{-1}}{.} Hence, the multilinearity and the alternating property follow from
exercise *****.
If we start with
\mathrelationchain
{\relationchain
{ A
}
{ = }{ E_{ n }
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
then
\mathrelationchain
{\relationchain
{ A \circ B
}
{ = }{ B
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
and thus
\mathrelationchaindisplay
{\relationchain
{ \delta( E_{ n } )
}
{ =} { (\det B ) \cdot (\det B )^{-1}
}
{ =} { 1
}
{ } {
}
{ } {
}
}
{}{}{.}
}
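As a small check of the multiplication theorem, take
\mathdisp {A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \text{ and } B = \begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix}} { . }
Then \mathl{A \circ B = \begin{pmatrix} 2 & 3 \\ 4 & 7 \end{pmatrix}}{,} and, indeed, \mathl{\det { \left( A \circ B \right) } = 14-12 = 2 = (-2) \cdot (-1) = \det A \cdot \det B}{.}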
\inputdefinition
{ }
{
Let $K$ be a
field,
and let
\mathrelationchain
{\relationchain
{ M
}
{ = }{( a_{ i j } )_{ i j }
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
be an
$m \times n$-matrix
over $K$. Then the \mathl{n \times m}{-}matrix
\mathdisp {{ M^{ \text{tr} } } ={ \left( b_{ij} \right) }_{ij} \text{ with } b_{ij} := a_{ji}} { }
is called the \definitionword {transposed matrix}{} of $M$.
}
The transposed matrix arises by interchanging the role of the rows and the columns. For example, we have
\mathrelationchaindisplay
{\relationchain
{ { \begin{pmatrix} t & n & o & d \\ r & s & s & x \\ a & p & e & y \end{pmatrix} ^{ \text{tr} } }
}
{ =} { \begin{pmatrix} t & r & a \\ n & s & p \\ o & s & e \\ d & x & y \end{pmatrix}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
\inputfaktbeweisnichtvorgefuehrt
{Determinant/Transposed matrix/Universal property/Fact}
{Theorem}
{}
{
\factsituation {Let $K$ denote a
field,
and let $M$ denote an
$n \times n$-matrix
over $K$.}
\factconclusion {Then
\mathrelationchaindisplay
{\relationchain
{ \det M
}
{ =} { \det { M^{ \text{tr} } }
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}}
\factextra {}
}
{
If $M$ is not invertible, then, due to
Theorem 26.11
,
the determinant is $0$, and the rank is smaller than $n$. Since, due to
Lemma 26.3
,
the rank is not changed under transposition, the determinant of the transposed matrix is again $0$. So suppose that $M$ is invertible. In this case, we reduce the statement to the corresponding statement for the elementary matrices, which can be verified directly, see
exercise *****.
Because of
Fact *****,
there exist
elementary matrices
\mathl{E_1 , \ldots , E_s}{} such that
\mathrelationchaindisplay
{\relationchain
{ D
}
{ =} { E_s \cdots E_1 M
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
is a
diagonal matrix.
Due to
exercise *****,
we have
\mathrelationchaindisplay
{\relationchain
{ { D^{ \text{tr} } }
}
{ =} { { M^{ \text{tr} } } { E_1^{ \text{tr} } } \cdots { E_s^{ \text{tr} } }
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
and
\mathrelationchaindisplay
{\relationchain
{ { M^{ \text{tr} } }
}
{ =} { { D^{ \text{tr} } } ( { E_s^{ \text{tr} } } )^{-1} \cdots ( { E_1^{ \text{tr} } } )^{-1}
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{.}
The diagonal matrix $D$ is not changed under transposition. Since the determinants of the elementary matrices are also not changed under transposition, we get, using
Fact *****,
\mathrelationchainalign
{\relationchainalign
{ \det { M^{ \text{tr} } }
}
{ =} { \det { \left( { D^{ \text{tr} } } ( { E_s^{ \text{tr} } } )^{-1} \cdots ( { E_1^{ \text{tr} } } )^{-1} \right) }
}
{ =} { \det { D^{ \text{tr} } } \cdot \det ( { E_s^{ \text{tr} } } )^{-1} \cdots \det ( { E_1^{ \text{tr} } } )^{-1}
}
{ =} { \det D \cdot \det { \left( E_s^{-1} \right) } \cdots \det { \left( E_1^{-1} \right) }
}
{ =} { \det { \left( E_1^{-1} \right) } \cdots \det { \left( E_s^{-1} \right) } \cdot \det D
}
}
{
\relationchainextensionalign
{ =} { \det { \left( E_1 ^{-1} \cdots E_s^{-1} \cdot D \right) }
}
{ =} { \det M
}
{ } {}
{ } {}
}
{}{.}
}
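For instance,
\mathdisp {\det \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = 1 \cdot 4 - 3 \cdot 2 = -2 = 1 \cdot 4 - 2 \cdot 3 = \det \begin{pmatrix} 1 & 3 \\ 2 & 4 \end{pmatrix}} { . }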
This implies that we can compute the determinant also by expanding with respect to the rows, as the following statement shows.
\inputfactproof
{Determinant/Laplace expansion/Fact}
{Corollary}
{}
{
\factsituation {Let $K$ be a
field,
and let
\mathrelationchain
{\relationchain
{ M
}
{ = }{ { \left( a_{ i j } \right) }_{ i j }
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
be an
$n \times n$-matrix
over $K$. For
\mathrelationchain
{\relationchain
{ i,j
}
{ \in }{ \{ 1 , \ldots , n \}
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
let \mathl{M_{ij}}{} be the matrix which arises from $M$, by leaving out the $i$-th row and the $j$-th column.}
\factconclusion {Then
\extrabracket {for
\mathrelationchain
{\relationchain
{ n
}
{ \geq }{ 2
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
and for every fixed $i$ and $j$} {} {}
\mathrelationchaindisplay
{\relationchain
{ \det M
}
{ =} { \sum_{ i = 1 }^{ n } (-1)^{i+j} a_{ij} \det M_{ij}
}
{ =} { \sum_{ j = 1 }^{ n } (-1)^{i+j} a_{ij} \det M_{ij}
}
{ } {
}
{ } {
}
}
{}{}{.}}
\factextra {}
}
{
For
\mathrelationchain
{\relationchain
{j
}
{ = }{1
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{,}
the first equation is the recursive definition of the
determinant.
From that statement, the case
\mathrelationchain
{\relationchain
{i
}
{ = }{1
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
follows,
due to
Theorem 26.15
.
By exchanging columns and rows, the statement follows in full generality, see
Exercise 26.13
.
}
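For instance, expanding with respect to the second row yields
\mathdisp {\det \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 10 \end{pmatrix} = -4 \det \begin{pmatrix} 2 & 3 \\ 8 & 10 \end{pmatrix} + 5 \det \begin{pmatrix} 1 & 3 \\ 7 & 10 \end{pmatrix} - 6 \det \begin{pmatrix} 1 & 2 \\ 7 & 8 \end{pmatrix} = 16-55+36 = -3} { , }
matching the value computed above with the rule of Sarrus.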
\subtitle {The determinant of a linear mapping}
Let
\mathdisp {\varphi \colon V \longrightarrow V} { }
be a linear mapping from a vector space of dimension $n$ into itself. This is described by a matrix
\mathrelationchain
{\relationchain
{M
}
{ \in }{ \operatorname{Mat}_{ n } (K)
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{}
with respect to a given basis. We would like to define the determinant of the linear mapping to be the determinant of the matrix. However, there is the problem whether this is \keyword {well-defined} {,} since a linear mapping is described by quite different matrices with respect to different bases.
But, because of
Corollary 25.9
,
when we have two describing matrices
\mathcor {} {M} {and} {N} {,}
and the matrix $B$ for the change of bases, we have the relation
\mathrelationchain
{\relationchain
{N
}
{ = }{ BMB^{-1}
}
{ }{
}
{ }{
}
{ }{
}
}
{}{}{.}
The multiplication theorem for determinants yields then
\mathrelationchaindisplay
{\relationchain
{ \det N
}
{ =} { \det { \left( BMB^{-1} \right) }
}
{ =} { ( \det B) (\det M) { \left( \det B^{-1} \right) }
}
{ =} { ( \det B) { \left( \det B^{-1} \right) } (\det M)
}
{ =} { \det M
}
}
{}{}{,}
so that the following definition is in fact independent of the basis chosen.
\inputdefinition
{ }
{
Let $K$ denote a
field,
and let $V$ denote a
$K$-vector space
of finite dimension. Let
\mathdisp {\varphi \colon V \longrightarrow V} { }
be a
linear mapping,
which is described by the
matrix
$M$, with respect to a
basis.
Then
\mathrelationchaindisplay
{\relationchain
{ \det \varphi
}
{ \defeq} { \det M
}
{ } {
}
{ } {
}
{ } {
}
}
{}{}{}
is called the \definitionword {determinant}{} of the linear mapping $\varphi$.
}
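For instance (over $K = \R$), consider the linear mapping
\mathdisp {\varphi \colon \R^2 \longrightarrow \R^2
, (x,y) \longmapsto (y,x)} { . }
With respect to the standard basis, it is described by the matrix \mathl{\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}}{,} with determinant $-1$. With respect to the basis consisting of \mathcor {} {u_1 = (1,1)} {and} {u_2 = (1,-1)} {,} we have \mathcor {} {\varphi(u_1) = u_1} {and} {\varphi(u_2) = -u_2} {,} so the describing matrix is \mathl{\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}}{,} again with determinant $-1$. Hence, \mathl{\det \varphi = -1}{,} independently of the chosen basis.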