Dini test

In mathematics, the Dini and Dini–Lipschitz tests are highly precise tests that can be used to prove that the Fourier series of a function converges at a given point. These tests are named after Ulisse Dini and Rudolf Lipschitz.[1]

Definition

Let f be a function on [0,2π], let t be some point and let δ be a positive number. We define the local modulus of continuity at the point t by

$$\omega_f(\delta;t) = \max_{|\varepsilon| \le \delta} |f(t) - f(t+\varepsilon)|$$

Notice that f is considered here as a periodic function; for example, if t = 0 and ε is negative, then we define f(ε) = f(2π + ε). The global modulus of continuity (or simply the modulus of continuity) is defined by

$$\omega_f(\delta) = \max_t \omega_f(\delta;t)$$
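
For instance, if f is 2π-periodic and Lipschitz continuous with constant L, that is, $|f(s) - f(u)| \le L|s - u|$ for all s and u, then directly from these definitions

$$\omega_f(\delta;t) \le L\delta \quad \text{for every } t, \qquad \text{and hence} \qquad \omega_f(\delta) \le L\delta.$$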

With these definitions we may state the main results:

Theorem (Dini's test): Assume a function f satisfies at a point t that
$$\int_0^\pi \frac{1}{\delta}\,\omega_f(\delta;t)\,d\delta < \infty.$$
Then the Fourier series of f converges at t to f(t).

For example, the theorem holds with $\omega_f = \log^{-2}(1/\delta)$ but does not hold with $\log^{-1}(1/\delta)$.
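
This can be checked directly; since only the behaviour of the integrand as $\delta \to 0$ matters for the convergence of the Dini integral, it suffices to integrate over a small interval such as $(0, 1/2]$:

$$\int_0^{1/2} \frac{\log^{-2}(1/\delta)}{\delta}\,d\delta = \left[\frac{1}{\log(1/\delta)}\right]_{\delta \to 0^+}^{\delta = 1/2} = \frac{1}{\log 2} < \infty,$$

whereas

$$\int_0^{1/2} \frac{\log^{-1}(1/\delta)}{\delta}\,d\delta = \Bigl[-\log\log(1/\delta)\Bigr]_{\delta \to 0^+}^{\delta = 1/2} = \infty.$$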

Theorem (the Dini–Lipschitz test): Assume a function f satisfies
$$\omega_f(\delta) = o\left(\log\frac{1}{\delta}\right)^{-1}.$$
Then the Fourier series of f converges uniformly to f.

In particular, any function that obeys a Hölder condition satisfies the Dini–Lipschitz test.
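
Indeed, a Hölder condition $|f(s) - f(u)| \le C|s - u|^\alpha$ with $C > 0$ and $\alpha > 0$ gives $\omega_f(\delta) \le C\delta^\alpha$, and

$$C\,\delta^\alpha \log\frac{1}{\delta} \longrightarrow 0 \quad \text{as } \delta \to 0^+,$$

so $\omega_f(\delta) = o\left(\log\frac{1}{\delta}\right)^{-1}$.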

Precision

Both tests are the best of their kind. For the Dini–Lipschitz test, it is possible to construct a function f whose modulus of continuity satisfies the test with O instead of o, i.e.

$$\omega_f(\delta) = O\left(\log\frac{1}{\delta}\right)^{-1},$$

and the Fourier series of f diverges. For the Dini test, the statement of precision is slightly longer: it says that for any function Ω such that

$$\int_0^\pi \frac{1}{\delta}\,\Omega(\delta)\,d\delta = \infty$$

there exists a function f such that

$$\omega_f(\delta;0) < \Omega(\delta)$$

and the Fourier series of f diverges at 0.

References

  1. Gustafson, Karl E. (1999), Introduction to Partial Differential Equations and Hilbert Space Methods, Courier Dover Publications, p. 121, ISBN 978-0-486-61271-3