Mathematics for Applied Sciences (Osnabrück 2023-2024)/Part I/Lecture 23


The solution set of a homogeneous system of linear equations in $n$ variables over a field $K$ is a linear subspace of $K^n$. Quite often, this solution space is described as the set of all "linear combinations" of finitely many (simple) solutions. In this lecture, we develop the concepts needed to make this precise.



Generating systems
The plane generated by two vectors $v_1$ and $v_2$ consists of all linear combinations $s v_1 + t v_2$.

Definition  

Let $K$ be a field, and let $V$ be a $K$-vector space. Let $v_1 , \ldots , v_n$ denote a family of vectors in $V$. Then the vector

$$s_1 v_1 + s_2 v_2 + \cdots + s_n v_n \quad \text{with } s_i \in K$$

is called a linear combination of these vectors

(for the coefficient tuple $(s_1 , \ldots , s_n)$).

Two different coefficient tuples can define the same vector.
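For instance, the following small computation (a sketch using the SymPy library; the vectors are chosen only for illustration) exhibits two different coefficient tuples defining the same vector.

```python
from sympy import Matrix

# Two vectors in Q^2; v2 is a multiple of v1, so the family is dependent.
v1 = Matrix([1, 2])
v2 = Matrix([2, 4])

# The coefficient tuples (2, 0) and (0, 1) define the same vector:
a = 2 * v1 + 0 * v2
b = 0 * v1 + 1 * v2
print(a == b)  # True: both linear combinations equal (2, 4)
```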


Definition  

Let $K$ be a field, and let $V$ be a $K$-vector space. A family $v_i \in V$, $i \in I$, is called a generating system (or spanning system) of $V$ if every vector $v \in V$ can be written as

$$v = \sum_{j \in J} s_j v_j$$

with a finite subfamily $J \subseteq I$ and with coefficients

$s_j \in K$.

In $K^n$, the standard vectors $e_i$, $1 \leq i \leq n$, form a generating system. In the polynomial ring $K[X]$, the powers $X^n$, $n \in \mathbb{N}$, form an (infinite) generating system.


Definition  

Let $K$ be a field, and let $V$ be a $K$-vector space. For a family $v_i \in V$, $i \in I$, we set

$$\langle v_i ,\, i \in I \rangle = \left\{ \sum_{j \in J} s_j v_j \mid s_j \in K ,\, J \subseteq I \text{ finite} \right\}$$

and call this the linear span of the family, or the generated linear subspace.

The empty set generates the null space.[1] The null space is also generated by the element $0$. A single vector $v$ spans the space $Kv = \{ sv \mid s \in K \}$. For $v \neq 0$, this is a line, a term we will make more precise in the framework of dimension theory. For two vectors $v$ and $w$, the "form" of the spanned space depends on how the two vectors are related to each other. If they both lie on a line, say $w = sv$, then $w$ is superfluous, and the linear subspace generated by the two vectors equals the linear subspace generated by $v$. If this is not the case (and $v$ and $w$ are not $0$), then the two vectors span a "plane".
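Whether two vectors span only a line or a whole plane can be tested computationally via the rank of the matrix having them as columns; a minimal sketch (assuming SymPy, with vectors chosen for illustration) shows both cases.

```python
from sympy import Matrix

v = Matrix([1, 2, 0])
w = Matrix([2, 4, 0])  # w = 2*v, so w lies on the line K*v
u = Matrix([0, 0, 1])  # u is not a scalar multiple of v

# Rank 1: the two vectors span only a line; rank 2: they span a plane.
print(Matrix.hstack(v, w).rank())  # 1
print(Matrix.hstack(v, u).rank())  # 2
```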

We list some simple properties for generating systems and linear subspaces.


Lemma

Let $K$ be a field, and let $V$ be a $K$-vector space. Then the following statements hold.
  1. For a family $v_i$, $i \in I$, of elements in $V$, the linear span $\langle v_i ,\, i \in I \rangle$ is a linear subspace of $V$.
  2. The family $v_i$, $i \in I$, is a spanning system of $V$ if and only if $\langle v_i ,\, i \in I \rangle = V$.

Proof



Linear independence

Definition  

Let $K$ be a field, and let $V$ be a $K$-vector space. A family of vectors $v_i$, $i \in I$ (where $I$ denotes a finite index set), is called linearly independent if an equation of the form

$$\sum_{i \in I} s_i v_i = 0 \quad \text{with } s_i \in K$$

is only possible when

$s_i = 0$ for all $i \in I$.

If a family is not linearly independent, then it is called linearly dependent. A linear combination $\sum_{i \in I} s_i v_i = 0$ is called a representation of the null vector. It is called the trivial representation if all coefficients $s_i$ equal $0$; if at least one coefficient is not $0$, it is a nontrivial representation of the null vector. A family of vectors is linearly independent if and only if the null vector can be represented with it only in the trivial way. This is equivalent to the property that no vector of the family can be expressed as a linear combination of the others.
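Computationally, linear independence can be checked by looking for representations of the null vector; a minimal sketch with SymPy (the family below is an arbitrary illustration):

```python
from sympy import Matrix

# The vectors of the family form the columns of A; a representation of
# the null vector is a solution s of A*s = 0.
A = Matrix([[1, 0, 1],
            [0, 1, 1],
            [0, 0, 1]])

# An empty nullspace basis means only the trivial representation exists,
# so the columns are linearly independent.
print(A.nullspace())  # []
```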


Example

The standard vectors $e_1 , \ldots , e_n$ in $K^n$ are linearly independent. A representation

$$s_1 e_1 + s_2 e_2 + \cdots + s_n e_n = 0$$

just means

$$\begin{pmatrix} s_1 \\ s_2 \\ \vdots \\ s_n \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{pmatrix}.$$

The $i$-th row yields directly $s_i = 0$.


Example

The three vectors

$$\begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix}, \quad \begin{pmatrix} 0 \\ 1 \\ 1 \end{pmatrix}, \quad \begin{pmatrix} 1 \\ 1 \\ 2 \end{pmatrix}$$

are linearly dependent. The equation

$$1 \cdot \begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix} + 1 \cdot \begin{pmatrix} 0 \\ 1 \\ 1 \end{pmatrix} - 1 \cdot \begin{pmatrix} 1 \\ 1 \\ 2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}$$

is a nontrivial representation of the null vector.
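The relation can also be confirmed mechanically; the following SymPy sketch verifies the nontrivial representation and recovers it from the nullspace.

```python
from sympy import Matrix

v1 = Matrix([1, 0, 1])
v2 = Matrix([0, 1, 1])
v3 = Matrix([1, 1, 2])

# The nontrivial representation of the null vector from the example:
print(v1 + v2 - v3)  # the zero vector

# The nullspace of the matrix with these columns recovers the relation.
A = Matrix.hstack(v1, v2, v3)
print(A.nullspace())  # one basis vector: (-1, -1, 1), up to scaling
```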


Lemma

Let $K$ be a field, let $V$ be a $K$-vector space, and let $v_i$, $i \in I$,

be a family of vectors in $V$. Then the following statements hold.
  1. If the family is linearly independent, then for each subset $J \subseteq I$, also the family $v_i$, $i \in J$, is linearly independent.
  2. The empty family is linearly independent.
  3. If the family contains the null vector, then it is not linearly independent.
  4. If a vector appears several times in the family, then the family is not linearly independent.
  5. A single vector $v$ is linearly independent if and only if $v \neq 0$.
  6. Two vectors $v$ and $u$ are linearly independent if and only if $u$ is not a scalar multiple of $v$, and vice versa.

Proof



Remark

The vectors $v_1 , \ldots , v_n \in K^m$ are linearly dependent if and only if the homogeneous linear system

$$s_1 v_1 + s_2 v_2 + \cdots + s_n v_n = 0$$

has a nontrivial solution $(s_1 , \ldots , s_n) \neq 0$.
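In matrix terms, this criterion compares the rank with the number of vectors; a short SymPy sketch (vectors chosen for illustration):

```python
from sympy import Matrix

# Columns of A are the vectors v_1, ..., v_n; the homogeneous system
# A*s = 0 has a nontrivial solution exactly when rank(A) < n.
A = Matrix([[1, 0, 1],
            [0, 1, 1],
            [1, 1, 2]])
print(A.rank() < A.cols)  # True: the columns are linearly dependent
```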



Basis

Definition  

Let $K$ be a field, and let $V$ be a $K$-vector space. Then a linearly independent generating system $v_i \in V$, $i \in I$,

of $V$ is called a basis of $V$.

Example

The standard vectors $e_1 , \ldots , e_n$ in $K^n$ form a basis. The linear independence was shown in Example 23.6. To show that they also form a generating system, let

$$v = \begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{pmatrix}$$

be an arbitrary vector. Then we have immediately

$$v = \sum_{i = 1}^{n} b_i e_i.$$

Hence, we have a basis, which is called the standard basis of $K^n$.


Theorem

Let $K$ be a field, and let $V$ be a $K$-vector space. Let

$$v_1 , \ldots , v_n \in V$$

be a family of vectors. Then the following statements are equivalent.
  1. The family is a basis of $V$.
  2. The family is a minimal generating system; that is, as soon as we remove one vector $v_i$, the remaining family is not a generating system any more.
  3. For every vector $u \in V$, there is exactly one representation $u = s_1 v_1 + \cdots + s_n v_n$.
  4. The family is maximal linearly independent; that is, as soon as some vector is added, the family is not linearly independent any more.

Proof

This proof was not presented in the lecture.



Remark

Let a basis $v_1 , \ldots , v_n$ of a $K$-vector space $V$ be given. Due to Theorem 23.12, this means that for every vector $u \in V$, there exists a uniquely determined representation

$$u = s_1 v_1 + s_2 v_2 + \cdots + s_n v_n.$$

The elements $s_i \in K$ (scalars) are called the coordinates of $u$ with respect to the given basis. Thus, for a fixed basis, we have a (bijective) correspondence between the vectors from $V$ and the coordinate tuples $(s_1 , s_2 , \ldots , s_n) \in K^n$. We express this by saying that a basis determines a linear coordinate system.[2]
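Concretely, the coordinates with respect to a basis are obtained by solving a linear system; a minimal SymPy sketch, for an assumed basis of $\mathbb{Q}^2$:

```python
from sympy import Matrix

# Basis vectors (1,0) and (1,1) as the columns of B; u is the vector
# whose coordinates we want.
B = Matrix([[1, 1],
            [0, 1]])
u = Matrix([3, 2])

# The coordinates s satisfy B*s = u; for a basis, the solution is unique.
s = B.solve(u)
print(s)  # (1, 2): indeed u = 1*(1,0) + 2*(1,1)
```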


Theorem

Let $K$ be a field, and let $V$ be a $K$-vector space with a finite generating system. Then $V$ has a finite basis.

Proof  

Let $v_i$, $i \in I$, be a finite generating system of $V$, with a finite index set $I$. We argue with the characterization from Theorem 23.12. If the family is minimal, then we have a basis. If not, then there exists some $k \in I$ such that the remaining family, where $v_k$ is removed, that is, $v_i$, $i \in I \setminus \{k\}$, is still a generating system. In this case, we can go on with this smaller index set. With this method, we arrive at a subset $J \subseteq I$ such that $v_i$, $i \in J$, is a minimal generating system, hence a basis.
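The removal procedure in this proof is effective; the following sketch (a hypothetical helper shrink_to_basis, using SymPy ranks to test whether a vector is superfluous) mirrors it for column vectors over $\mathbb{Q}$.

```python
from sympy import Matrix

def shrink_to_basis(vectors):
    """Remove superfluous vectors until the generating system is minimal,
    mirroring the proof: a vector may be dropped whenever the remaining
    family still spans the same space (tested via the rank)."""
    vecs = list(vectors)
    target = Matrix.hstack(*vecs).rank()  # dimension of the span
    i = 0
    while i < len(vecs):
        rest = vecs[:i] + vecs[i+1:]
        if rest and Matrix.hstack(*rest).rank() == target:
            vecs = rest   # vecs[i] was superfluous; drop it and retry
        else:
            i += 1        # vecs[i] is needed; keep it
    return vecs

gens = [Matrix([1, 0]), Matrix([2, 0]), Matrix([0, 1]), Matrix([1, 1])]
print(shrink_to_basis(gens))  # two vectors remain: a basis of Q^2
```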



Dimension theory

A finitely generated vector space has many quite different bases. However, the number of elements in a basis is always the same and depends only on the vector space. We will now formulate this important property and take it as the point of departure for the definition of the dimension of a vector space.


Theorem

Let $K$ be a field, and let $V$ be a $K$-vector space with a finite generating system. Then any two bases of $V$ have the same number of vectors.

Proof

This proof was not presented in the lecture.


This theorem enables the following definition.


Definition  

Let $K$ be a field, and let $V$ be a $K$-vector space with a finite generating system. Then the number of vectors in any basis of $V$ is called the dimension of $V$, written

$$\dim(V).$$

Due to the preceding theorem, the dimension is well-defined. If a vector space is not finitely generated, then one puts $\dim(V) = \infty$. The null space has dimension $0$. A one-dimensional vector space is called a line, a two-dimensional vector space a plane, and a three-dimensional vector space a space (in the strict sense); but every vector space is called a space.


Corollary

Let $K$ be a field and $n \in \mathbb{N}$. Then the standard space $K^n$ has the dimension $n$.

Proof  

The standard basis $e_i$, $1 \leq i \leq n$, consists of $n$ vectors; hence the dimension is $n$.



Example

The complex numbers $\mathbb{C}$ form a two-dimensional real vector space; a basis is given by $1$ and $\mathrm{i}$.


Example

The polynomial ring $K[X]$ over a field $K$ is not a finite-dimensional vector space. To see this, we have to show that there is no finite generating system for the polynomial ring. Consider polynomials $P_1 , \ldots , P_n \in K[X]$. Let $d$ be the maximum of the degrees of these polynomials. Then every $K$-linear combination $\sum_{i = 1}^{n} s_i P_i$ has degree at most $d$. In particular, polynomials of larger degree cannot be represented by $P_1 , \ldots , P_n$, so these do not form a generating system for all polynomials.
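The degree bound in this argument can be illustrated numerically; a small SymPy sketch with two arbitrarily chosen polynomials:

```python
from sympy import symbols, Poly

x = symbols('x')

# Two sample polynomials; d is the maximum of their degrees.
ps = [Poly(x**2 + 1, x), Poly(x**3 - x, x)]
d = max(p.degree() for p in ps)

# Every linear combination has degree at most d, so x**(d+1) is never
# reached: these polynomials generate no polynomial of degree d + 1.
combo = 5 * ps[0] + 7 * ps[1]
print(combo.degree() <= d)  # True for any choice of coefficients
```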


Corollary

Let $V$ denote a finite-dimensional vector space over a field $K$. Let $U \subseteq V$ denote a linear subspace. Then $U$ is also finite-dimensional, and the estimate

$$\dim(U) \leq \dim(V)$$

holds.

Proof

This proof was not presented in the lecture.



Corollary

Let $K$ be a field, and let $V$ be a $K$-vector space with finite dimension $n = \dim(V)$. Let $n$ vectors $v_1 , \ldots , v_n$ in $V$ be given. Then the following properties are equivalent.

  1. $v_1 , \ldots , v_n$ form a basis of $V$.
  2. $v_1 , \ldots , v_n$ form a generating system of $V$.
  3. $v_1 , \ldots , v_n$ are linearly independent.

Proof
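For $n$ vectors in $K^n$ itself, the three equivalent conditions of this corollary amount to the invertibility of the square matrix with these vectors as columns; a minimal SymPy sketch (matrix chosen for illustration):

```python
from sympy import Matrix

# Three vectors in Q^3 as the columns of a square matrix A.
A = Matrix([[1, 1, 0],
            [0, 1, 1],
            [0, 0, 1]])

# Basis <=> generating system <=> linearly independent <=> A invertible.
print(A.det() != 0)   # True: the columns form a basis of Q^3
print(A.rank() == 3)  # the same test via the rank
```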



Example

Let $K$ be a field. It is easy to get an overview of the linear subspaces of $K^n$, as the dimension of a linear subspace equals some $k$ with $0 \leq k \leq n$, due to Corollary 23.20. For $n = 0$, there is only the null space itself; for $n = 1$, there are the null space and $K$ itself. For $n = 2$, there are the null space, the whole plane $K^2$, and the one-dimensional lines through the origin. Every line $G$ has the form

$$G = Kv = \{ sv \mid s \in K \}$$

with a vector $v \neq 0$. Two vectors different from $0$ define the same line if and only if they are linearly dependent. For $n = 3$, there are the null space, the whole space $K^3$, the one-dimensional lines through the origin, and the two-dimensional planes through the origin.
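The criterion for two nonzero vectors to define the same line is again a rank condition; a minimal SymPy sketch:

```python
from sympy import Matrix

# w = 2*v, so both nonzero vectors lie on (and define) the same line.
v = Matrix([2, 3])
w = Matrix([4, 6])

# Rank 1 means the vectors are linearly dependent, i.e. K*v == K*w.
print(Matrix.hstack(v, w).rank() == 1)  # True
```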


Theorem

Let $V$ denote a finite-dimensional vector space over a field $K$. Let

$$u_1 , \ldots , u_k$$

denote linearly independent vectors in $V$. Then there exist vectors

$$u_{k+1} , \ldots , u_n$$

such that

$$u_1 , \ldots , u_k , u_{k+1} , \ldots , u_n$$

form a basis of $V$.

Proof

This proof was not presented in the lecture.
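In $K^n$, one way to carry out such a basis extension is to greedily add those standard vectors that raise the rank; the following sketch (a hypothetical helper extend_to_basis, assuming SymPy) illustrates the idea.

```python
from sympy import Matrix, eye

def extend_to_basis(indep, n):
    """Extend linearly independent vectors in K^n to a basis by adding
    those standard vectors that strictly increase the rank."""
    vecs = list(indep)
    for i in range(n):
        if len(vecs) == n:
            break  # already a basis
        e = eye(n).col(i)  # standard vector e_{i+1}
        if Matrix.hstack(*vecs, e).rank() > Matrix.hstack(*vecs).rank():
            vecs.append(e)
    return vecs

u1 = Matrix([1, 1, 0])
basis = extend_to_basis([u1], 3)
print(basis)  # u1 together with two standard vectors: a basis of Q^3
```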




Footnotes
  1. This follows from the definition, if we use the convention that the empty sum equals $0$.
  2. Linear coordinates give a bijective relation between points and number tuples. Due to linearity, such a bijection respects the addition and the scalar multiplication. In many different contexts, nonlinear (curvilinear) coordinates are also important. These put the points of a space and number tuples into a bijective relation. Examples are polar coordinates, cylindrical coordinates, and spherical coordinates. By choosing suitable coordinates, mathematical problems, like the computation of volumes, can be simplified.

