Multilinearity of the recursively defined determinant

We want to show that the recursively defined determinant is a "multilinear" and "alternating" mapping, where we identify

${\displaystyle {}\operatorname {Mat} _{n}(K)\cong (K^{n})^{n}\,,}$

so that a matrix is identified with the ${\displaystyle {}n}$-tuple of its rows. We write a matrix as a column of rows

${\displaystyle {\begin{pmatrix}v_{1}\\\vdots \\v_{n}\end{pmatrix}}}$

where the entries ${\displaystyle {}v_{i}}$ are row vectors of length ${\displaystyle {}n}$.

Theorem

Let ${\displaystyle {}K}$ be a field, and ${\displaystyle {}n\in \mathbb {N} _{+}}$. Then the determinant

${\displaystyle \operatorname {Mat} _{n}(K)=(K^{n})^{n}\longrightarrow K,M\longmapsto \det M,}$

is multilinear. This means that for every ${\displaystyle {}k\in \{1,\ldots ,n\}}$, and for every choice of ${\displaystyle {}n-1}$ vectors ${\displaystyle {}v_{1},\ldots ,v_{k-1},v_{k+1},\ldots ,v_{n}\in K^{n}}$, and for any ${\displaystyle {}u,w\in K^{n}}$, the identity

${\displaystyle {}\det {\begin{pmatrix}v_{1}\\\vdots \\v_{k-1}\\u+w\\v_{k+1}\\\vdots \\v_{n}\end{pmatrix}}=\det {\begin{pmatrix}v_{1}\\\vdots \\v_{k-1}\\u\\v_{k+1}\\\vdots \\v_{n}\end{pmatrix}}+\det {\begin{pmatrix}v_{1}\\\vdots \\v_{k-1}\\w\\v_{k+1}\\\vdots \\v_{n}\end{pmatrix}}\,}$

holds, and for ${\displaystyle {}s\in K}$, the identity

${\displaystyle {}\det {\begin{pmatrix}v_{1}\\\vdots \\v_{k-1}\\su\\v_{k+1}\\\vdots \\v_{n}\end{pmatrix}}=s\det {\begin{pmatrix}v_{1}\\\vdots \\v_{k-1}\\u\\v_{k+1}\\\vdots \\v_{n}\end{pmatrix}}\,}$

holds.

Proof

This proof was not presented in the lecture.
${\displaystyle \Box }$
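Both identities can be checked numerically. The following is a minimal Python sketch (the helper `det` and the vector names are ours, chosen for illustration) of the recursive determinant via Laplace expansion along the first row, together with a test of additivity and the scaling rule for ${\displaystyle {}k=2}$ in a ${\displaystyle {}3\times 3}$ matrix:

```python
def det(m):
    """Determinant via recursive Laplace expansion along the first row."""
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j]
               * det([row[:j] + row[j + 1:] for row in m[1:]])  # minor: drop row 0, column j
               for j in range(n))

# Fixed rows v1, v3 and varying second row (k = 2).
v1, v3 = [7, 8, 9], [1, 0, 2]
u, w = [1, 2, 3], [4, 5, 6]
s = 5

# Additivity in the k-th row.
lhs_add = det([v1, [a + b for a, b in zip(u, w)], v3])
rhs_add = det([v1, u, v3]) + det([v1, w, v3])
assert lhs_add == rhs_add

# Compatibility with scaling of the k-th row.
lhs_scale = det([v1, [s * a for a in u], v3])
assert lhs_scale == s * det([v1, u, v3])
```

Using exact integer arithmetic here avoids floating-point round-off, so the equalities can be tested with `==`.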

Theorem

Let ${\displaystyle {}K}$ be a field, and ${\displaystyle {}n\in \mathbb {N} _{+}}$. Then the determinant

${\displaystyle \operatorname {Mat} _{n}(K)=(K^{n})^{n}\longrightarrow K,M\longmapsto \det M,}$
has the following properties.
1. If in ${\displaystyle {}M}$ two rows are identical, then ${\displaystyle {}\det M=0}$. This means that the determinant is alternating.
2. If we exchange two rows in ${\displaystyle {}M}$, then the determinant is multiplied by the factor ${\displaystyle {}-1}$.

Proof

This proof was not presented in the lecture.
${\displaystyle \Box }$
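Both properties can again be tested numerically; the sketch below (with the same illustrative recursive `det` helper as above, not from the text) checks that a repeated row forces determinant ${\displaystyle {}0}$ and that exchanging two rows flips the sign:

```python
def det(m):
    """Determinant via recursive Laplace expansion along the first row."""
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j]
               * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(n))

v = [3, 1, 4]
rows = [[1, 5, 9], v, [2, 6, 5]]

# (1) Two identical rows force the determinant to be 0.
assert det([v, v, [2, 6, 5]]) == 0

# (2) Exchanging two rows multiplies the determinant by -1.
swapped = [rows[1], rows[0], rows[2]]
assert det(swapped) == -det(rows)
```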

Theorem

Let ${\displaystyle {}K}$ be a field, and let ${\displaystyle {}M}$ denote an ${\displaystyle {}n\times n}$-matrix

over ${\displaystyle {}K}$. Then the following statements are equivalent.
1. We have ${\displaystyle {}\det M\neq 0}$.
2. The rows of ${\displaystyle {}M}$ are linearly independent.
3. ${\displaystyle {}M}$ is invertible.
4. We have ${\displaystyle {}\operatorname {rk} \,M=n}$.

Proof

The equivalence of invertibility, linear independence of the rows, and rank ${\displaystyle {}n}$ was proven earlier. Suppose now that the rows are linearly dependent. After exchanging rows, we may assume that ${\displaystyle {}v_{n}=\sum _{i=1}^{n-1}s_{i}v_{i}}$. Then, by the multilinearity and the alternating property of the determinant established above, we get

${\displaystyle {}\det M=\det {\begin{pmatrix}v_{1}\\\vdots \\v_{n-1}\\\sum _{i=1}^{n-1}s_{i}v_{i}\end{pmatrix}}=\sum _{i=1}^{n-1}s_{i}\det {\begin{pmatrix}v_{1}\\\vdots \\v_{n-1}\\v_{i}\end{pmatrix}}=0\,.}$

Now suppose that the rows are linearly independent. Then, by exchanging rows, scaling rows, and adding one row to another, we can transform the matrix successively into the identity matrix. In each of these manipulations, the determinant is multiplied by some factor ${\displaystyle {}\neq 0}$. Since the determinant of the identity matrix is ${\displaystyle {}1}$, the determinant of the initial matrix is ${\displaystyle {}\neq 0}$.

${\displaystyle \Box }$
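Both directions of the argument can be illustrated numerically: a matrix whose last row is a linear combination of the other rows has determinant ${\displaystyle {}0}$, while adding a multiple of one row to another (a combination of scaling and row addition, as used in the reduction to the identity matrix) does not change the determinant. A sketch, again with an illustrative `det` helper not taken from the text:

```python
def det(m):
    """Determinant via recursive Laplace expansion along the first row."""
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j]
               * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(n))

# Linearly dependent rows: third row = 2*v1 + v2, so the determinant is 0.
v1, v2 = [1, 2, 3], [0, 1, 4]
dep = [v1, v2, [a + 2 * b for a, b in zip(v2, v1)]]
assert det(dep) == 0

# Adding a multiple of one row to another leaves the determinant unchanged.
m = [[2, 0, 1], [1, 3, 0], [0, 1, 1]]
m2 = [m[0], [a + 7 * b for a, b in zip(m[1], m[0])], m[2]]
assert det(m2) == det(m)
assert det(m) != 0  # the rows of m are linearly independent
```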