Tensors/Definitions
Subject classification: this is a mathematics resource.
Subject classification: this is a physics resource.
Educational level: this is a secondary education resource.
Educational level: this is a tertiary (university) resource.
- In this article, all vector spaces are real and finite-dimensional.
A tensor is a concept from mathematical physics that can be thought of as a generalization of a vector. While tensors can be defined in a purely mathematical sense, they are most useful in connection with vectors in physics.
Definition
A tensor is a real-valued function of some number of vectors and/or linear forms, which is linear in each of its arguments.
These terms need to be explained carefully. If a tensor is a function of some arguments, one might see things like
$T(\Phi, V, W, X)$
Since it is real-valued, we might have an equation like
$T(\Phi, V, W, X) = 5.73$
The rank of a tensor is the number of arguments. The tensor in this example is a 4th rank tensor.
We will use capital Roman letters to denote vector arguments, for example, V, W, and X. We will use capital Greek letters to denote form arguments, for example, $\Phi$ and $\Psi$. We will explain what forms are shortly. For now, don't worry about them. They are like vectors. In fact, in a way that will be explained later, they actually are vectors, but in a different vector space.
If all of the arguments are vectors, the tensor is said to be a covariant tensor. If they are all forms, it is a contravariant tensor. Otherwise, it is a mixed tensor. The tensor in the example above is mixed. In fact, we could say that its rank of covariance is 3 and its rank of contravariance is 1. A zeroth-rank tensor, that is, a function of no arguments, is just a number. It is properly called a scalar, and is a completely legitimate tensor.
Being linear is crucial. The function must be linear in each of its arguments. This means that, if any argument is multiplied by a scalar, the result will be multiplied by that scalar:
$T(\Phi, \alpha V, W, X) = \alpha \, T(\Phi, V, W, X)$
Also, if an argument is the sum of two things, the result is the sum of the corresponding values:
$T(\Phi, V_1 + V_2, W, X) = T(\Phi, V_1, W, X) + T(\Phi, V_2, W, X)$
These two facts are often combined into one formula:
$T(\Phi, \alpha V_1 + \beta V_2, W, X) = \alpha \, T(\Phi, V_1, W, X) + \beta \, T(\Phi, V_2, W, X)$
This property must hold for each argument, so, for example:
$T(\alpha \Phi_1 + \beta \Phi_2, V, W, X) = \alpha \, T(\Phi_1, V, W, X) + \beta \, T(\Phi_2, V, W, X)$
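One consequence, worked out here as an illustration (it is not stated above): linearity in each argument separately is not the same as linearity in all the arguments at once. Scaling every argument of the 4th rank tensor above by 2 scales the result by $2^4 = 16$, not by 2:
$T(2\Phi, 2V, 2W, 2X) = 2 \cdot 2 \cdot 2 \cdot 2 \, T(\Phi, V, W, X) = 16 \, T(\Phi, V, W, X)$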
We can now define a linear form. It is just a real-valued linear function of one vector argument. That means that it is a 1st rank covariant tensor[1]. If a form $\Phi$ is linear, that means that, for any vectors V and W and any number $\alpha$:
$\Phi(\alpha V) = \alpha \, \Phi(V)$
and
$\Phi(V + W) = \Phi(V) + \Phi(W)$
The space of forms is a vector space: we have the notion of multiplying a form by a number, and of adding two forms. For any forms $\Phi$ and $\Psi$, any vector V, and any number $\alpha$:
$(\alpha \Phi)(V) = \alpha \, \Phi(V)$
and
$(\Phi + \Psi)(V) = \Phi(V) + \Psi(V)$
Now a 1st rank contravariant tensor is just a vector. It is, by definition, a linear function that takes a form as its argument. That this is the same thing as a vector will be shown in the next article. Accepting that, we have:
- A 1st rank contravariant tensor is a vector. It is also a linear real-valued function that takes a linear form as its argument. (Shown in the next article.)
- A 1st rank covariant tensor is a form. It is also a linear real-valued function that takes a vector as its argument. (This one is obvious.)
One occasionally sees names for higher-rank non-mixed tensors. Tensors that are purely covariant are given names: bilinear form for a function of two vectors, trilinear form for a function of three vectors, quadrilinear form, and, in general, multilinear form. These are based on Latin words.
Tensors that are purely contravariant are also given names: dyadic for a function of two forms, triadic for a function of three forms, tetradic, pentadic and, in general, polyadic. These are based on Greek words.
There is one other extremely important type of tensor—2nd rank mixed. This is a linear operator or linear transformation. How this comes about is an important trick of tensor algebra.
Suppose we have an operator A, that is, a function that takes a vector argument and yields a vector result:
$W = A(V)$
We need to find a mixed tensor T that is the same thing. T must be a function that takes two arguments, a form and a vector. Define it this way:
$T(\Phi, V) = \Phi(A(V))$
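As a quick check (a short verification added here, not spelled out above), this T really is linear in each slot. Linearity in the form slot uses only the way forms are added and scaled; linearity in the vector slot uses the linearity of the operator A and of the form:
$T(\alpha \Phi_1 + \beta \Phi_2, V) = (\alpha \Phi_1 + \beta \Phi_2)(A(V)) = \alpha \, T(\Phi_1, V) + \beta \, T(\Phi_2, V)$
$T(\Phi, \alpha V_1 + \beta V_2) = \Phi(A(\alpha V_1 + \beta V_2)) = \Phi(\alpha A(V_1) + \beta A(V_2)) = \alpha \, T(\Phi, V_1) + \beta \, T(\Phi, V_2)$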
Conversely, if we are given a mixed tensor T, it takes two arguments, a form and a vector. Given a vector V, we can define a function Q that takes a form as its argument, as follows:
$Q(\Phi) = T(\Phi, V)$
Since Q takes a form as its argument, it is a 1st rank contravariant tensor, which is a vector, as shown in the next article.
Symmetry
A tensor is symmetric in a given pair of arguments if it gives the same result when those two arguments are switched with each other. This is similar to the commutative property of addition. A tensor is antisymmetric if it gives the negative of the result when the two arguments are switched. This is similar to the anticommutative property of subtraction. Symmetry and antisymmetry are only meaningful if the arguments being swapped are of the same type—both covariant or both contravariant[2].
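In symbols, for a 2nd rank covariant tensor S (the name S is chosen here just for illustration):
$S(V, W) = S(W, V)$ if S is symmetric, and
$S(V, W) = -S(W, V)$ if S is antisymmetric.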
Some famous tensors
The metric tensor, or just the metric, 2nd rank covariant symmetric, is very commonly used to define the "inner product" or "dot product" of two vectors. Being a symmetric bilinear function of two vectors, it is just the right thing for defining a dot product.
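For example, in ordinary Euclidean space the metric (written here as g, a symbol not introduced above) reproduces the familiar dot product:
$g(V, W) = V \cdot W = |V| \, |W| \cos\theta$
where $\theta$ is the angle between the two vectors.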
The Levi-Civita tensor, Nth rank covariant, completely antisymmetric (N is the dimension of the space), measures the area/volume/hypervolume of the parallelogram/parallelepiped/parallelotope spanned by N vectors. It has the unusual property of having its sign depend on the "handedness" of the space. As such, it is a pseudo-tensor.
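One consequence of the complete antisymmetry (an observation added here for illustration, writing the tensor as $\varepsilon$ and taking N = 2): swapping the two vectors flips the sign, so feeding in the same vector twice must give zero, which matches the fact that a degenerate parallelogram has zero area:
$\varepsilon(V, W) = -\varepsilon(W, V), \qquad \varepsilon(V, V) = -\varepsilon(V, V) = 0$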
Faraday's tensor, 2nd rank contravariant antisymmetric, is the tensor that explains electrodynamics and Maxwell's Equations in 4-dimensional relativistic spacetime. Its components are the components of the classical electric and magnetic fields.
The stress tensor, 2nd rank covariant symmetric, is the tensor in 3 dimensions that describes the mechanical stresses on an object.
The Maxwell stress tensor, 2nd rank contravariant symmetric, is the tensor in 3 dimensions that describes the classical stress of the electric and magnetic fields. The Electromagnetic stress-energy tensor generalizes this to 4 dimensions in relativity and describes the energy and momentum densities of the electromagnetic field.
Riemann's tensor, 4th rank mixed, is made from the derivatives (gradients) of the metric tensor in different parts of space (that is, a tensor field), and describes the curvature of the space.
The stress-energy-momentum tensor, 2nd rank covariant symmetric, is the tensor in 4-dimensional relativistic spacetime that describes all the stresses, forces, momenta, matter, and energy.
Ricci's tensor and Einstein's tensor, 2nd rank covariant symmetric, are simplified versions of Riemann's tensor, describe the curvature of spacetime, and make General relativity work. The principle of gravitation in General relativity simply says that Einstein's tensor is 8πK times the stress-energy-momentum tensor, where K is Newton's constant of gravitation.
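Written as an equation (in units where the speed of light is 1, an assumption not stated above, and using the index notation of a later article in this series), the principle just described reads:
$G_{\mu\nu} = 8 \pi K \, T_{\mu\nu}$
where $G_{\mu\nu}$ is Einstein's tensor and $T_{\mu\nu}$ is the stress-energy-momentum tensor.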
The next article in this series is Tensors/Bases, components, and dual spaces.
See also
- Tensors/Bases, components, and dual spaces
- Tensors/Calculations with index notation
- Tensors/Transformation rule under a change of basis