Measure Theory/Integrating Derivatives
Integrating Derivatives
We now turn, as promised before, to the task of finding conditions under which the second part of the FTC holds. That is to say, we want to find when it makes sense to assert

$$\int_a^b f'(x)\,dx = f(b) - f(a).$$
However, we first need a detour through monotone functions, and then through functions of bounded variation. It will not initially be apparent how these topics relate to our goal in this section, so for now I can only promise that, eventually, they will.
That said, it is not as though this alternate line of inquiry "comes out of nowhere". As I will try to describe briefly below, the original motivation was to answer important applied questions about Fourier series.
Variation
Fourier and Monotone Functions
[edit | edit source]It is yet another opportunity to marvel at just how much of modern mathematics is an inheritance of Fourier series.
At some point, the mathematician Dirichlet discovered that a function is equal to its Fourier series on the condition that it is monotonic. This caused mathematicians to become further interested in what else can be accomplished by studying monotone functions.
It was pretty immediately apparent that piecewise monotone functions have similar properties regarding the convergence of their Fourier series.
Roughly speaking, one only has to find the Fourier series on each interval where the function is monotonic. Then one may combine the various Fourier series in a natural way.
Subsequent to all of this, mathematicians realized that monotonicity is sufficient to prove that a function is differentiable almost everywhere. The proof of this fact will be important to us, and therefore we will spend a lesson proving it.
Variation
Decades after Dirichlet, the mathematician Jordan realized that we needn't stop at monotonicity. In fact we could "push" the concept of piecewise monotonicity to a new extreme. This new extreme is the concept of a function of "bounded variation".
Let us approach the concept of bounded variation by trying to invent it ourselves, through a few considerations.
Let us fix the interval [0,1] and imagine just how badly a function may oscillate from increasing to decreasing, on this interval. Of course it may do so any finite number of times, and it is not too hard to come up with simple examples of a function switching from increasing to decreasing n times for any n.
In fact it is not very hard to construct an example of a function switching from increasing to decreasing countably infinitely many times. The reader is encouraged to do so now, if she is so inclined; one possible construction is sketched just below.
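For the reader who wants to check her construction against one, here is a possible example (my own illustration, not from the original text):

$$f(x) = \begin{cases} x\sin(1/x), & x \in (0,1], \\ 0, & x = 0. \end{cases}$$

Between consecutive critical points the function alternates between increasing and decreasing, and these critical points accumulate at $0$, so $f$ switches between increasing and decreasing countably infinitely many times on $[0,1]$.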
But the fact that a function may oscillate so pathologically then threatens that we may not be able to write it as the sum of an increasing part and a decreasing part -- which would in turn block our ability to say that it equals its Fourier series.
The reader may be interested, at this moment, to consider the extreme case of the Dirichlet function, $\mathbf{1}_{\mathbb{Q}}(x)$, which equals 1 when $x$ is rational and 0 when $x$ is irrational. On any non-degenerate interval, this function in a sense "oscillates" infinitely often, between 0 and 1.
What Jordan realized is that we would like to, in some sense, "infinitely partition" a given function, and try to capture a notion of the change in the function, either in the positive or negative direction.
It is possible to do this in a fairly literal sense. One may take any partition of the compact interval $[a,b]$, which we write as $P = \{a = x_0 < x_1 < \dots < x_n = b\}$. Then define the sum of the positive changes, $p(f,P) = \sum_{i=1}^{n} \big(f(x_i) - f(x_{i-1})\big)^+$, where $t^+ = \max\{t,0\}$, and then define the positive variation as the supremum taken over all partitions,

$$P_a^b(f) = \sup_P \, p(f,P).$$
If we then use this to define a function of $x$ by letting the end of the interval be a variable, $P(x) = P_a^x(f)$, then this function $P$ essentially "increases in the same way that $f$ increases".
One could then go on to define the negative variation of f and use this to try to split f into the positive and negative variation parts.
However, it will simplify our work to not have two different objects, the positive and the negative variation. For most of our work, we can accomplish all of the same goals using just a single object, the total variation.
Definition: variation
Let $f: [a,b] \to \mathbb{R}$ be any function, and let $P = \{a = x_0 < x_1 < \dots < x_n = b\}$ be any partition of the interval $[a,b]$. Define the variation of $f$ on $P$ by

$$V(f,P) = \sum_{i=1}^{n} \big| f(x_i) - f(x_{i-1}) \big|.$$
Then define the variation of $f$ on $[a,b]$ by

$$V_a^b(f) = \sup_P \, V(f,P).$$
We say that $f$ has bounded variation on $[a,b]$ if $V_a^b(f)$ is a finite real number.
Define the function $T(x) = V_a^x(f)$, the total variation of $f$, by taking the end-point to be a variable $x$.
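Before moving on, a small numerical sketch may help make the definition concrete. The code below is my own illustration (not part of the original text): it approximates $V(f,P)$ over finer and finer uniform partitions of $[0,1]$ for the oscillating function $x\sin(1/x)$ from earlier. The partial sums keep growing as the partition is refined, which is consistent with this function failing to have bounded variation.

```python
# A minimal sketch (author's own illustration): approximate V(f, P) over
# uniform partitions of [0, 1] for f(x) = x*sin(1/x), with f(0) = 0.
import math

def f(x):
    return x * math.sin(1.0 / x) if x != 0 else 0.0

def variation(g, a, b, n):
    """Sum of |g(x_i) - g(x_{i-1})| over a uniform partition of [a, b] with n subintervals."""
    xs = [a + (b - a) * i / n for i in range(n + 1)]
    return sum(abs(g(xs[i]) - g(xs[i - 1])) for i in range(1, n + 1))

for n in (10, 100, 1000, 10000):
    print(n, variation(f, 0.0, 1.0, n))
# The sums grow without bound as n increases, consistent with V_0^1(f) = +infinity;
# for g(x) = x**2 * math.sin(1/x) they would instead stabilize at a finite value.
```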
Exercise 1. Dirichlet Infinite Variation
Show that the Dirichlet function $\mathbf{1}_{\mathbb{Q}}$ does not have bounded variation on any non-degenerate interval $[a,b]$.
Exercise 2. TV Increasing
Show that the total variation function $T(x) = V_a^x(f)$ is monotonically increasing.
Bounded Variation and Up-Down Decomposition
What Jordan found is that, if a function has bounded variation, then it always has an increasing part and a decreasing part. In fact, it turns out that we can say even more than this.
Definition: increasing, decreasing parts
Let $f: A \to \mathbb{R}$ be any function defined on a subset $A \subseteq \mathbb{R}$. Suppose that there exist two monotonically increasing functions, $U, D: A \to \mathbb{R}$, such that $f = U - D$.
Then $U$ is called the increasing part and $D$ is the decreasing part of $f$. This is often called the Jordan decomposition of the function $f$, although I will prefer the more descriptive up-down decomposition.
Note that both parts are monotonically increasing, but because we subtract D, this "accounts" for the portion of f which is decreasing.
Also notice that, so long as the total variation function $T$ is finite at every point, then we have the equality

$$f = T - (T - f)$$

and therefore, so long as $T - f$ is monotonically increasing, this will be the up-down decomposition that we were seeking.
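To make this concrete, here is a small worked example of my own (not in the original text). Take the tent function on $[0,2]$:

$$f(x) = \begin{cases} x, & 0 \le x \le 1, \\ 2 - x, & 1 < x \le 2. \end{cases}$$

Its total variation function is $T(x) = V_0^x(f) = x$ for every $x \in [0,2]$: on $[0,1]$ the function rises by $x$, and past $1$ the accumulated rise of $1$ is joined by a fall of $x - 1$. Then $U = T$ and $D = T - f$, where $D(x) = 0$ on $[0,1]$ and $D(x) = 2x - 2$ on $[1,2]$, are both monotonically increasing, and indeed $f = U - D$.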
Exercise 4. BV Iff Up Down Decomp
Theorem: Let $f: [a,b] \to \mathbb{R}$ be any function. Then $f$ has bounded variation if and only if there exist two monotonically increasing functions, $U, D: [a,b] \to \mathbb{R}$, such that $f = U - D$.

1. Assume that $f$ has bounded variation. We will set $U = T$, as we have noted above, and $D = T - f$. So all that we must do is prove that $T - f$ is monotonically increasing. By definition, you need to show that if $x < y$ then $T(x) - f(x) \le T(y) - f(y)$. It is natural to re-arrange this with similar terms grouped together. It should make some sense to expect that $f(y) - f(x) \le T(y) - T(x)$. But rather than going straight for this result, it may be easier to prove a lemma first: For any $a \le x < y \le b$ we have $V_a^y(f) = V_a^x(f) + V_x^y(f)$. It should also be a one- or two-line proof to show that $f(y) - f(x) \le V_x^y(f)$.

2. Now assume $f$ has the up-down decomposition given by $U$ and $D$, and show that $f$ has bounded variation. Hint: Start by considering any partition $P$, and the variation $V(f,P)$. Now split this up into positive and negative variation, as described at the start of the subsection Variation. Show that the positive variation is upper-bounded by $U(b) - U(a)$, and from here the rest of the solution may be clear enough.
Dini Derivatives
Finally I want to introduce a concept which will allow for a certain amount of simplification later. First let's establish some notation and make a few observations:
Definition: difference quotient
Let $f: A \to \mathbb{R}$ be a function defined on a subset $A \subseteq \mathbb{R}$. Let $x \in A$, and let $h \neq 0$ be such that $x + h \in A$. We say that the difference quotient of $f$ at $x$ is given by

$$\frac{f(x+h) - f(x)}{h}.$$
The derivative of $f$ at $x$ is then the limit of the difference quotient as $h$ goes to zero.
However, there are a number of different ways that this limit may fail to exist.
- The limit may be infinity.
- The left-handed limit may exist but the right-handed limit may be infinity.
- The two handed limits may both exist but fail to equal each other.
- The left-handed limit may fail to exist even without being infinite, for instance due to oscillation.
And so on, with many other variations.
Our proofs in the next several lessons will become simpler if we instead focus on the limsup of the difference quotient, rather than the limit. This is because the limsup is guaranteed to exist, if we count "being infinite" as an extended kind of "existence".
We will similarly also consider the liminf, and the handed limits, each separately. The derivative then exists when all four of these quantities are equal and finite.
Definition: upper-, lower-, left-, right-derivatives
The upper-left derivative of $f$ at $x$ is defined as

$$D^- f(x) = \limsup_{h \to 0^-} \frac{f(x+h) - f(x)}{h}.$$
Similarly we define the lower-left, upper-right, lower-right derivatives by the following, respectively.

$$D_- f(x) = \liminf_{h \to 0^-} \frac{f(x+h) - f(x)}{h}, \qquad D^+ f(x) = \limsup_{h \to 0^+} \frac{f(x+h) - f(x)}{h}, \qquad D_+ f(x) = \liminf_{h \to 0^+} \frac{f(x+h) - f(x)}{h}.$$
These are collectively often called Dini derivatives.
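For a quick illustration (my own example, not from the original text), take $f(x) = |x|$ at $x = 0$. For $h < 0$ the difference quotient is $|h|/h = -1$, and for $h > 0$ it is $+1$, so

$$D^- f(0) = D_- f(0) = -1, \qquad D^+ f(0) = D_+ f(0) = +1.$$

All four Dini derivatives exist and are finite, but since they are not all equal, the derivative itself does not exist at $0$.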
Exercise 5. Dini Derivatives Exist
Show that, for any function $f$ and any point $x$, the upper-left derivative $D^- f(x)$ either exists or is infinite. Infer that the remaining Dini derivatives also exist or are infinite.