Inner product/K/Orthogonal basis/Introduction/Section
Let $V$ be a ${\mathbb K}$-vector space, endowed with an inner product. A basis $v_i$, $i \in I$, of $V$ is called an orthogonal basis if $\langle v_i , v_j \rangle = 0$ for all $i \neq j$.
Let $V$ be a ${\mathbb K}$-vector space, endowed with an inner product $\langle - , - \rangle$. A basis $v_i$, $i \in I$, of $V$ is called an orthonormal basis if $\langle v_i , v_i \rangle = 1$ for all $i \in I$ and $\langle v_i , v_j \rangle = 0$ for $i \neq j$.
The elements in an orthonormal basis have norm $1$, and they are orthogonal to each other. Hence, an orthonormal basis is an orthogonal basis that also satisfies the norm condition $\lVert v_i \rVert = 1$.
It is easy to transform an orthogonal basis into an orthonormal basis, by replacing every $v_i$ by its normalization $\frac{v_i}{\lVert v_i \rVert}$ (as $v_i$ is part of a basis, its norm is not $0$). A family of vectors, all of norm $1$ and orthogonal to each other, but not necessarily a basis, is called an orthonormal system.
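The normalization step can be checked numerically. The following is a minimal sketch, assuming a hypothetical orthogonal (but not orthonormal) basis of $\mathbb{R}^2$ that is not taken from the text:

```python
import numpy as np

# Hypothetical orthogonal basis of R^2 with norms 2 and 3.
v1 = np.array([2.0, 0.0])
v2 = np.array([0.0, 3.0])

# Normalize each vector: v / ||v|| has norm 1 and keeps its direction.
u1 = v1 / np.linalg.norm(v1)
u2 = v2 / np.linalg.norm(v2)

print(np.linalg.norm(u1), np.linalg.norm(u2))  # 1.0 1.0
print(np.dot(u1, u2))                          # 0.0 (still orthogonal)
```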
Let $V$ be a ${\mathbb K}$-vector space, endowed with an inner product, and let $u_i$, $i \in I$, be an orthonormal basis of $V$. Then the coefficients of a vector $v \in V$, with respect to this basis, are given by $c_i = \langle v , u_i \rangle$, that is, $v = \sum_{i \in I} \langle v , u_i \rangle u_i$.
Since we have a basis, there exists a unique representation $v = \sum_{j \in I} c_j u_j$ (where all $c_j$ are $0$ up to finitely many). Therefore, the claim follows from $\langle v , u_i \rangle = \bigl\langle \sum_{j \in I} c_j u_j , u_i \bigr\rangle = \sum_{j \in I} c_j \langle u_j , u_i \rangle = c_i$.
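Numerically, this fact says: take the inner product of $v$ with each basis vector to obtain its coordinates. A minimal sketch, assuming a hypothetical orthonormal basis of $\mathbb{R}^2$ (a rotation of the standard basis by 45 degrees):

```python
import numpy as np

# Hypothetical orthonormal basis of R^2 (not from the text).
u1 = np.array([1.0, 1.0]) / np.sqrt(2)
u2 = np.array([-1.0, 1.0]) / np.sqrt(2)

v = np.array([3.0, 5.0])

# By the fact above, the coefficient of v along u_i is <v, u_i>.
c1 = np.dot(v, u1)
c2 = np.dot(v, u2)

# Reassembling c1*u1 + c2*u2 recovers v.
reconstruction = c1 * u1 + c2 * u2
print(np.allclose(reconstruction, v))  # True
```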
We will mainly consider orthonormal bases in the finite-dimensional case. In ${\mathbb K}^n$, the standard basis is an orthonormal basis. In the plane ${\mathbb R}^2$, any orthonormal basis is of the form $\begin{pmatrix} \cos \alpha \\ \sin \alpha \end{pmatrix} , \begin{pmatrix} - \sin \alpha \\ \cos \alpha \end{pmatrix}$ or of the form $\begin{pmatrix} \cos \alpha \\ \sin \alpha \end{pmatrix} , \begin{pmatrix} \sin \alpha \\ - \cos \alpha \end{pmatrix}$, where $\alpha \in [0, 2 \pi)$ holds. For example, with $\alpha = \frac{\pi}{4}$, we obtain the orthonormal basis $\frac{1}{\sqrt{2}} \begin{pmatrix} 1 \\ 1 \end{pmatrix} , \frac{1}{\sqrt{2}} \begin{pmatrix} -1 \\ 1 \end{pmatrix}$. The following Gram–Schmidt orthonormalization describes a method to construct, starting with any basis of a finite-dimensional vector space, an orthonormal basis that generates the same flag of linear subspaces.
Let $V$ be a finite-dimensional ${\mathbb K}$-vector space, endowed with an inner product, and let $v_1 , \ldots , v_n$ be a basis of $V$. Then there exists an orthonormal basis $u_1 , \ldots , u_n$ of $V$ with[1]

$\langle v_1 , \ldots , v_i \rangle = \langle u_1 , \ldots , u_i \rangle$

for all $i = 1 , \ldots , n$.
We prove the statement by induction over $i$; that is, we construct successively a family of orthonormal vectors $u_1 , \ldots , u_i$ spanning the same linear subspaces as $v_1 , \ldots , v_i$. For $i = 1$, we just have to normalize $v_1$, that is, we replace it by $u_1 = \frac{v_1}{\lVert v_1 \rVert}$. Now suppose that the statement is already proven for $i$. Let a family of orthonormal vectors $u_1 , \ldots , u_i$ fulfilling $\langle u_1 , \ldots , u_i \rangle = \langle v_1 , \ldots , v_i \rangle$ be already constructed. We set

$w_{i+1} = v_{i+1} - \sum_{j = 1}^{i} \langle v_{i+1} , u_j \rangle u_j .$
Due to

$\langle w_{i+1} , u_k \rangle = \langle v_{i+1} , u_k \rangle - \sum_{j = 1}^{i} \langle v_{i+1} , u_j \rangle \langle u_j , u_k \rangle = \langle v_{i+1} , u_k \rangle - \langle v_{i+1} , u_k \rangle = 0 ,$

this vector is orthogonal to all $u_1 , \ldots , u_i$, and also

$\langle u_1 , \ldots , u_i , w_{i+1} \rangle = \langle u_1 , \ldots , u_i , v_{i+1} \rangle = \langle v_1 , \ldots , v_{i+1} \rangle$

holds. By normalizing $w_{i+1}$, we obtain $u_{i+1} = \frac{w_{i+1}}{\lVert w_{i+1} \rVert}$.
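The inductive construction above translates directly into an algorithm. The following sketch (in Python with NumPy, applied to a hypothetical basis of $\mathbb{R}^3$ chosen only for illustration) normalizes at each step, exactly as in the proof:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors.

    Follows the inductive step of the proof: subtract from each v its
    components along the already-constructed u_1, ..., u_i, then normalize.
    """
    ortho = []
    for v in vectors:
        w = v - sum(np.dot(v, u) * u for u in ortho)
        ortho.append(w / np.linalg.norm(w))
    return ortho

# Hypothetical basis of R^3 (any basis works).
basis = [np.array([1.0, 1.0, 0.0]),
         np.array([1.0, 0.0, 1.0]),
         np.array([0.0, 1.0, 1.0])]
u = gram_schmidt(basis)

# The result is orthonormal: its Gram matrix is the identity.
G = np.array([[np.dot(a, b) for b in u] for a in u])
print(np.allclose(G, np.eye(3)))  # True
```

Note that, as footnote [2] remarks, it can be numerically and computationally preferable to orthogonalize first and normalize all vectors at the very end.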
Let $U$ be the kernel of the linear mapping
As a linear subspace, $U$ carries the induced inner product. We want to determine an orthonormal basis of $U$. For this, we consider the basis consisting of the vectors
We have ; therefore,
is the corresponding normalized vector. Following the orthonormalization process,[2] we set
We have
Therefore,
is the second vector of the orthonormal basis.
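The specific mapping and vectors of this example were not recoverable here; as an illustration of the same two-step computation, the following sketch assumes the hypothetical linear map $\mathbb{R}^3 \to \mathbb{R}$, $(x,y,z) \mapsto x+y+z$, whose kernel is spanned by $(1,-1,0)$ and $(0,1,-1)$:

```python
import numpy as np

# Hypothetical kernel basis (the map (x,y,z) |-> x+y+z sends both to 0).
v1 = np.array([1.0, -1.0, 0.0])
v2 = np.array([0.0, 1.0, -1.0])

# Step 1: normalize v1.
u1 = v1 / np.linalg.norm(v1)

# Step 2: remove from v2 its component along u1, then normalize.
w2 = v2 - np.dot(v2, u1) * u1
u2 = w2 / np.linalg.norm(w2)

# u1, u2 is an orthonormal basis of the kernel: both have coordinate
# sum 0 (so they lie in the kernel) and they are orthonormal.
print(np.isclose(u1.sum(), 0.0), np.isclose(u2.sum(), 0.0))  # True True
print(np.isclose(np.dot(u1, u2), 0.0))                       # True
```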
Let $V$ be a finite-dimensional ${\mathbb K}$-vector space, endowed with an inner product. Then there exists an orthonormal basis of $V$.
This follows directly from fact.
Therefore, in a finite-dimensional vector space with an inner product, one can always extend a given orthonormal system to an orthonormal basis; see exercise.
Let $V$ be a finite-dimensional ${\mathbb K}$-vector space, endowed with an inner product, and let $U \subseteq V$ denote a linear subspace. Then we have

$V = U \oplus U^{\perp} ,$

that is, $V$ is the direct sum of $U$ and its orthogonal complement.
For $v \in U \cap U^{\perp}$, we get directly $\langle v , v \rangle = 0$, thus $v = 0$. This means that the sum is direct. Let $u_1 , \ldots , u_m$ be an orthonormal basis of $U$. We extend it to an orthonormal basis $u_1 , \ldots , u_m , u_{m+1} , \ldots , u_n$ of $V$. Then we have $u_{m+1} , \ldots , u_n \in U^{\perp}$. Therefore, $V = U + U^{\perp}$ is the sum of the linear subspaces.
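The decomposition $V = U \oplus U^{\perp}$ can be computed explicitly once an orthonormal basis of $U$ is known: project onto $U$ using the coefficient formula, and the remainder lies in $U^{\perp}$. A sketch with a hypothetical plane $U \subseteq \mathbb{R}^3$:

```python
import numpy as np

# Hypothetical setup: U = span{u1, u2} in R^3, with u1, u2 orthonormal.
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 1.0]) / np.sqrt(2)

v = np.array([2.0, 3.0, 5.0])

# Component of v in U: sum of projections onto the orthonormal basis of U.
p = np.dot(v, u1) * u1 + np.dot(v, u2) * u2
# The remainder is orthogonal to U, i.e. it lies in the complement.
q = v - p

print(np.isclose(np.dot(q, u1), 0.0), np.isclose(np.dot(q, u2), 0.0))  # True True
print(np.allclose(p + q, v))  # True
```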
For the following statement, compare also fact and exercise.
Let $V$ be a ${\mathbb K}$-vector space, endowed with an inner product $\langle - , - \rangle$. Then the following statements hold.
- For a linear subspace $U \subseteq V$, we have $U \subseteq { \left( U^{ \perp } \right) }^{ \perp }$.
- We have $V^{ \perp } = 0$ and $0^{ \perp } = V$.
- Let $V$ be finite-dimensional, and $U \subseteq V$ a linear subspace. Then we have $\dim (U) + \dim ( U^{ \perp } ) = \dim (V)$.
- Let $V$ be finite-dimensional, and $U \subseteq V$ a linear subspace. Then we have ${ \left( U^{ \perp } \right) }^{ \perp } = U$.
Proof
- ↑ Here, $\langle v_1 , \ldots , v_i \rangle$ denotes the linear subspace spanned by the vectors, not the inner product.
- ↑ Often, it is computationally better to first orthogonalize, and to normalize only at the very end; see example.