A system of linear equations can easily be written with a matrix. This allows us to perform the manipulations that lead to the solution of such a system without writing down the variables. Matrices are quite simple objects; however, they can represent quite different mathematical objects (e.g., a family of column vectors, a family of row vectors, a linear mapping, a table of physical interactions, a vector field, etc.), which one has to keep in mind in order to prevent wrong conclusions.
Let $K$ denote a field, and let $I$ and $J$ denote index sets. An $I \times J$-matrix is a mapping

$$a \colon I \times J \longrightarrow K , \quad (i,j) \longmapsto a_{ij} .$$

If $I = \{1, \ldots, m\}$ and $J = \{1, \ldots, n\}$, then we talk about an $m \times n$-matrix. In this case, the matrix is usually written as

$$\begin{pmatrix} a_{11} & a_{12} & \ldots & a_{1n} \\ a_{21} & a_{22} & \ldots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \ldots & a_{mn} \end{pmatrix} .$$
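The definition of a matrix as a mapping on $I \times J$ can be sketched directly in code. The following Python snippet is only an illustration (the index sets and the rule for the entries are chosen here for the example): it stores such a mapping as a dictionary keyed by index pairs $(i, j)$, with entries in the field of rational numbers.

```python
# An I x J-matrix as a mapping I x J -> K, sketched as a Python dict
# keyed by (row index, column index). Here K is the rationals,
# modeled exactly by fractions.Fraction.
from fractions import Fraction

I = [1, 2]        # row index set
J = [1, 2, 3]     # column index set

# The entry a_ij := i/j (an arbitrary rule, chosen for illustration).
a = {(i, j): Fraction(i, j) for i in I for j in J}

print(a[(2, 3)])  # the entry a_23, i.e. 2/3
```

In practice one fixes $I = \{1, \ldots, m\}$ and $J = \{1, \ldots, n\}$ and uses a rectangular array instead of a dictionary, but the dictionary makes the "matrix = mapping" point of view explicit.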
We will usually restrict to this situation. For every $i \in I$, the family $a_{ij}$, $j \in J$, is called the $i$-th row of the matrix, which is usually written as a row vector

$$( a_{i1} , a_{i2} , \ldots , a_{in} ) .$$

For every $j \in J$, the family $a_{ij}$, $i \in I$, is called the $j$-th column of the matrix, usually written as a column vector

$$\begin{pmatrix} a_{1j} \\ a_{2j} \\ \vdots \\ a_{mj} \end{pmatrix} .$$
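Concretely, if a matrix is stored as a list of its rows, the $i$-th row and $j$-th column can be read off as follows. This is a minimal sketch; the helper names `row` and `column` are our own, and the indexing is kept 1-based to match the text.

```python
# A 3 x 2 matrix, stored as a list of its rows (an illustrative
# representation, not part of the formal definition above).
A = [
    [1, 2],
    [3, 4],
    [5, 6],
]

def row(A, i):
    """Return the i-th row of A as a list (1-indexed, as in the text)."""
    return A[i - 1]

def column(A, j):
    """Return the j-th column of A as a list (1-indexed)."""
    return [r[j - 1] for r in A]

print(row(A, 2))     # the 2nd row: [3, 4]
print(column(A, 1))  # the 1st column: [1, 3, 5]
```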
The elements $a_{ij}$ are called the entries of the matrix. For $a_{ij}$, the number $i$ is called the row index, and $j$ is called the column index of the entry. The position of the entry $a_{ij}$ is where the $i$-th row meets the $j$-th column. A matrix with $m = n$ is called a square matrix. An $m \times 1$-matrix is simply a column tuple (or column vector) of length $m$, and a $1 \times n$-matrix is simply a row tuple (or row vector) of length $n$. The set of all matrices with $m$ rows and $n$ columns (and with entries in $K$) is denoted by $\operatorname{Mat}_{m \times n}(K)$; in case $m = n$, we also write $\operatorname{Mat}_{n}(K)$.
Two matrices $A, B \in \operatorname{Mat}_{m \times n}(K)$ are added by adding corresponding entries. The multiplication of a matrix $A$ with an element $r \in K$ (a scalar) is also defined entrywise, so

$$\begin{pmatrix}a_{11}&a_{12}&\ldots &a_{1n}\\a_{21}&a_{22}&\ldots &a_{2n}\\\vdots &\vdots &\ddots &\vdots \\a_{m1}&a_{m2}&\ldots &a_{mn}\end{pmatrix}+\begin{pmatrix}b_{11}&b_{12}&\ldots &b_{1n}\\b_{21}&b_{22}&\ldots &b_{2n}\\\vdots &\vdots &\ddots &\vdots \\b_{m1}&b_{m2}&\ldots &b_{mn}\end{pmatrix}=\begin{pmatrix}a_{11}+b_{11}&a_{12}+b_{12}&\ldots &a_{1n}+b_{1n}\\a_{21}+b_{21}&a_{22}+b_{22}&\ldots &a_{2n}+b_{2n}\\\vdots &\vdots &\ddots &\vdots \\a_{m1}+b_{m1}&a_{m2}+b_{m2}&\ldots &a_{mn}+b_{mn}\end{pmatrix}$$

and

$$r\begin{pmatrix}a_{11}&a_{12}&\ldots &a_{1n}\\a_{21}&a_{22}&\ldots &a_{2n}\\\vdots &\vdots &\ddots &\vdots \\a_{m1}&a_{m2}&\ldots &a_{mn}\end{pmatrix}=\begin{pmatrix}ra_{11}&ra_{12}&\ldots &ra_{1n}\\ra_{21}&ra_{22}&\ldots &ra_{2n}\\\vdots &\vdots &\ddots &\vdots \\ra_{m1}&ra_{m2}&\ldots &ra_{mn}\end{pmatrix} .$$
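Both operations translate directly into code, since they act entry by entry. A minimal sketch, with matrices stored as lists of rows (the function names `add` and `scale` are our own):

```python
def add(A, B):
    """Entrywise sum of two m x n matrices of the same shape."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def scale(r, A):
    """Multiply every entry of the matrix A by the scalar r."""
    return [[r * a for a in row] for row in A]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(add(A, B))    # [[6, 8], [10, 12]]
print(scale(3, A))  # [[3, 6], [9, 12]]
```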
The multiplication of matrices is defined in the following way: for an $m \times n$-matrix $A = (a_{ij})$ and an $n \times p$-matrix $B = (b_{jk})$, the product $AB$ is the $m \times p$-matrix whose entries are

$$(AB)_{ik} = \sum_{j=1}^{n} a_{ij} b_{jk} .$$

Such a matrix multiplication is only possible when the number of columns of the left-hand matrix equals the number of rows of the right-hand matrix. Just think of the scheme

$$( R\,O\,W\,R\,O\,W ) \begin{pmatrix} C \\ O \\ L \\ U \\ M \\ N \end{pmatrix} = ( RC + O^{2} + WL + RU + OM + WN ) ,$$

the result is a $1 \times 1$-matrix. In particular, one can multiply an $m \times n$-matrix with a column vector of length $n$ (the vector on the right), and the result is a column vector of length $m$. The two matrices can also be multiplied with the roles interchanged,

$$\begin{pmatrix} C \\ O \\ L \\ U \\ M \\ N \end{pmatrix} ( R\,O\,W\,R\,O\,W ) = \begin{pmatrix} CR & CO & CW & CR & CO & CW \\ OR & O^{2} & OW & OR & O^{2} & OW \\ LR & LO & LW & LR & LO & LW \\ UR & UO & UW & UR & UO & UW \\ MR & MO & MW & MR & MO & MW \\ NR & NO & NW & NR & NO & NW \end{pmatrix} .$$
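The row-times-column rule, $(AB)_{ik} = \sum_{j} a_{ij} b_{jk}$, can be sketched in a few lines of Python; the function name `matmul` is our own, and matrices are again lists of rows. Note how a $1 \times 2$ "row" times a $2 \times 1$ "column" yields a $1 \times 1$ matrix, while the product in the other order is $2 \times 2$:

```python
def matmul(A, B):
    """Product of an m x n matrix A and an n x p matrix B:
    entry (i, k) of AB is the sum over j of A[i][j] * B[j][k]."""
    n = len(B)
    assert all(len(row) == n for row in A), "columns of A must match rows of B"
    p = len(B[0])
    return [[sum(A[i][j] * B[j][k] for j in range(n)) for k in range(p)]
            for i in range(len(A))]

row = [[1, 2]]        # a 1 x 2 matrix
col = [[3], [4]]      # a 2 x 1 matrix
print(matmul(row, col))  # [[11]]            (a 1 x 1 matrix)
print(matmul(col, row))  # [[3, 6], [4, 8]]  (a 2 x 2 matrix)
```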
An $n \times n$-matrix of the form

$$\begin{pmatrix} d_{1} & 0 & \cdots & \cdots & 0 \\ 0 & d_{2} & 0 & \cdots & 0 \\ \vdots & \ddots & \ddots & \ddots & \vdots \\ 0 & \cdots & 0 & d_{n-1} & 0 \\ 0 & \cdots & \cdots & 0 & d_{n} \end{pmatrix}$$

is called a diagonal matrix.

The $n \times n$-matrix

$$E_{n} := \begin{pmatrix} 1 & 0 & \cdots & \cdots & 0 \\ 0 & 1 & 0 & \cdots & 0 \\ \vdots & \ddots & \ddots & \ddots & \vdots \\ 0 & \cdots & 0 & 1 & 0 \\ 0 & \cdots & \cdots & 0 & 1 \end{pmatrix}$$

is called the identity matrix. The identity matrix $E_{n}$ has the property $E_{n} M = M = M E_{n}$ for an arbitrary $n \times n$-matrix $M$.
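The neutrality of the identity matrix is easy to check computationally. A minimal sketch (the helper names `identity` and `matmul` are our own; `matmul` implements the row-times-column rule from above):

```python
def identity(n):
    """The n x n identity matrix E_n: ones on the diagonal, zeros elsewhere."""
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    """Product of matrices stored as lists of rows."""
    n = len(B)
    return [[sum(A[i][j] * B[j][k] for j in range(n)) for k in range(len(B[0]))]
            for i in range(len(A))]

M = [[2, 5], [7, 1]]
E = identity(2)

# E_n * M = M = M * E_n for this (arbitrary) 2 x 2 matrix M.
assert matmul(E, M) == M == matmul(M, E)
print("identity property holds for M =", M)
```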