A [distribution](/page/Distribution) is an abstract object — a continuous linear functional on a space of test functions. It cannot be evaluated at a point, integrated against Lebesgue measure, or substituted into classical formulas. Yet the most common distributions in practice *do* come from functions: the distribution $T_f$ associated to a locally integrable function $f$ acts on a test function $\varphi$ by integration, $T_f(\varphi) = \int f\varphi \, d\mathcal{L}^n$. These are the **regular distributions**, and the map $f \mapsto T_f$ is the bridge between classical function theory and the distributional framework.
Understanding this bridge is essential because most of analysis — $L^p$ estimates, Fourier transform formulas, pointwise bounds, Sobolev norms — operates on functions, while the distributional setting provides the generality needed for PDE theory. Every time one writes an expression like $|\hat{u}(\xi)|^2$ for a [tempered distribution](/page/Tempered%20Distributions) $u \in \mathcal{S}'(\mathbb{R}^n)$, one is implicitly asserting that $\hat{u}$ is a regular distribution — represented by an actual measurable function whose absolute value can be computed pointwise. Making this assertion rigorous requires the theory developed on this page.
## Motivation
[motivation]
### Why Functions Are Not Distributions
A measurable function $f: \Omega \to \mathbb{R}$ and a continuous linear functional $T: \mathcal{D}(\Omega) \to \mathbb{R}$ are fundamentally different objects. The function $f$ assigns a real number to each point $x \in \Omega$ (or at least to almost every point); the functional $T$ assigns a real number to each test function $\varphi \in \mathcal{D}(\Omega)$. The function lives on $\Omega$; the functional lives on the infinite-dimensional space $\mathcal{D}(\Omega)$.
Because of this type mismatch, the statement "$f$ is a distribution" is strictly meaningless. What *is* meaningful is the statement "$f$ generates a distribution" — meaning there is a specific distribution $T_f$ that encodes all the information in $f$, constructed by integration against test functions. The question is then: does $T_f$ retain all the information about $f$? If two functions generate the same distribution, must they be equal (almost everywhere)? If so, no information is lost, and we can work with $T_f$ as a faithful proxy for $f$.
### Why Integration Against Test Functions Is the Right Construction
The map $f \mapsto T_f$ defined by $T_f(\varphi) := \int_\Omega f(x)\varphi(x) \, d\mathcal{L}^n(x)$ is not the only way to associate a linear functional to a function. One could try pointwise sampling: $E_f(\varphi) := f(x_0)\,\varphi(x_0)$ for some fixed $x_0$. But this depends on the pointwise value of $f$ at $x_0$, which is undefined for $L^p$ functions (defined only up to null sets). Or one could try $A_f(\varphi) := \left(\int_\Omega f \, d\mathcal{L}^n\right)\varphi(x_0)$, but this loses all local information about $f$.
Integration against test functions is the right choice because it is simultaneously:
- **Well-defined on $L^1_{\mathrm{loc}}(\Omega)$**: The integral $\int_\Omega f\varphi \, d\mathcal{L}^n$ is finite for every $\varphi \in \mathcal{D}(\Omega)$ because $\varphi$ has compact support and $f$ is locally integrable.
- **Null-set insensitive**: If $f = g$ $\mathcal{L}^n$-a.e., then $T_f = T_g$ (integration does not see null sets), so the construction respects the equivalence relation on $L^1_{\mathrm{loc}}$.
- **Faithful**: The map $f \mapsto T_f$ is injective — distinct $L^1_{\mathrm{loc}}$ functions (distinct as equivalence classes) generate distinct distributions, as proved in the [injectivity theorem](/theorems/450) below.
- **Compatible with calculus**: The [distributional derivative](/page/Distributional%20Derivative) of $T_f$ recovers the classical derivative when $f$ is smooth, and extends it to non-smooth functions by integration by parts.
### The Injectivity Problem
Injectivity is not obvious. The question is: if $\int_\Omega f\varphi \, d\mathcal{L}^n = 0$ for every $\varphi \in \mathcal{D}(\Omega)$, must $f = 0$ a.e.? This would fail if $\mathcal{D}(\Omega)$ were too small — for instance, if $\mathcal{D}(\Omega)$ contained only the zero function, every $T_f$ would be the zero functional and injectivity would fail completely. It would also fail if we used a space of test functions that could not "detect" the values of $f$ at enough points. The proof that it works relies on mollification: if $\rho_\varepsilon$ is a standard [mollifier](/page/Mollifier), the function $y \mapsto \rho_\varepsilon(x - y)$ is a test function (for $x$ away from $\partial\Omega$), so $(f * \rho_\varepsilon)(x) = T_f(\rho_\varepsilon(x - \cdot)) = 0$. Since $f * \rho_\varepsilon \to f$ a.e. by the Lebesgue differentiation theorem, we conclude $f = 0$ a.e.
[/motivation]
## Definition
[definition: Regular Distribution]
Let $\Omega \subseteq \mathbb{R}^n$ be a non-empty open set and let $f \in L^1_{\mathrm{loc}}(\Omega)$. The **regular distribution** generated by $f$ is the distribution $T_f \in \mathcal{D}'(\Omega)$ defined by the map
\begin{align*}
T_f: \mathcal{D}(\Omega) &\to \mathbb{R} \\
\varphi &\mapsto \int_\Omega f(x)\,\varphi(x) \, d\mathcal{L}^n(x).
\end{align*}
[/definition]
That $T_f$ is a distribution requires verification. Linearity follows from linearity of the Lebesgue integral. For continuity, one checks the semi-norm condition from the [characterisation of distributions](/theorems/449): if $\mathrm{supp}(\varphi) \subseteq K$ for a compact $K \subset \Omega$, then
\begin{align*}
|T_f(\varphi)| &\le \int_K |f(x)||\varphi(x)| \, d\mathcal{L}^n(x) \le \left(\int_K |f| \, d\mathcal{L}^n\right) \sup_K |\varphi|.
\end{align*}
The constant $C_K := \int_K |f| \, d\mathcal{L}^n$ is finite because $f \in L^1_{\mathrm{loc}}(\Omega)$ and $K$ is compact. The bound $|T_f(\varphi)| \le C_K \sup_K |\varphi|$ has order $N_K = 0$: the distribution $T_f$ depends only on the values of $\varphi$, not on any of its derivatives. This means every regular distribution has **order zero**. The converse fails — the Dirac delta has order zero but is singular — but positive order does rule out regularity: for example, $\delta_0'(\varphi) = -\varphi'(0)$ has order $1$ and therefore cannot be regular.
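The order-zero estimate can be checked numerically even for an unbounded locally integrable $f$. Below is a minimal sketch; the choice $f(x) = |x|^{-1/2}$, the bump function, and the midpoint quadrature are illustrative assumptions, not part of the theory.

```python
import math

def bump(t):
    # Standard smooth bump supported in (-1, 1); its maximum is bump(0) = e^{-1}.
    return math.exp(-1.0 / (1.0 - t * t)) if abs(t) < 1.0 else 0.0

def f(x):
    # Locally integrable but unbounded near 0: f(x) = |x|^{-1/2}.
    return abs(x) ** -0.5

# Midpoint Riemann sums on K = [-1, 1]; midpoints never hit the singularity at 0.
N = 200_000
h = 2.0 / N
xs = [-1.0 + (k + 0.5) * h for k in range(N)]

T_f_phi = sum(f(x) * bump(x) for x in xs) * h   # T_f(phi) = int_K f * phi
C_K = sum(f(x) for x in xs) * h                 # int_K |f|, approximately 4
sup_phi = max(bump(x) for x in xs)              # sup_K |phi|, approximately e^{-1}

# The order-zero seminorm bound |T_f(phi)| <= C_K * sup_K |phi|:
assert abs(T_f_phi) <= C_K * sup_phi
```

The same grid is used for both sides, so the discrete inequality mirrors the term-by-term estimate in the proof.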
[definition: Singular Distribution]
A distribution $T \in \mathcal{D}'(\Omega)$ is **singular** if there is no $f \in L^1_{\mathrm{loc}}(\Omega)$ such that $T = T_f$.
[/definition]
[example: The Dirac Delta Is Singular]
For $x_0 \in \Omega$, the Dirac delta $\delta_{x_0}(\varphi) := \varphi(x_0)$ is a distribution. It is singular: if $\delta_{x_0} = T_f$ for some $f \in L^1_{\mathrm{loc}}(\Omega)$, then $\int_\Omega f\varphi \, d\mathcal{L}^n = \varphi(x_0)$ for all $\varphi \in \mathcal{D}(\Omega)$. Choose a sequence $\varphi_k \in \mathcal{D}(\Omega)$ with $0 \le \varphi_k \le 1$, $\varphi_k(x_0) = 1$, and $\mathrm{supp}(\varphi_k) \subseteq B(x_0, 1/k)$. The left side satisfies $|\int f\varphi_k \, d\mathcal{L}^n| \le \int_{B(x_0,1/k)} |f| \, d\mathcal{L}^n \to 0$ as $k \to \infty$ (since $f \in L^1_{\mathrm{loc}}$ and the measure of $B(x_0, 1/k)$ tends to zero). But the right side is $\varphi_k(x_0) = 1$ for every $k$. Contradiction.
[/example]
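The shrinking-bump argument in the example can be watched numerically. A sketch, with $f(x) = |x|^{-1/2}$ standing in as a hypothetical $L^1_{\mathrm{loc}}$ candidate for representing $\delta_0$:

```python
import math

def bump(t):
    return math.exp(-1.0 / (1.0 - t * t)) if abs(t) < 1.0 else 0.0

def phi(k, x):
    # phi_k(x) = bump(kx)/bump(0): phi_k(0) = 1, 0 <= phi_k <= 1,
    # and supp(phi_k) is contained in (-1/k, 1/k).
    return bump(k * x) / bump(0.0)

def f(x):
    return abs(x) ** -0.5  # a locally integrable candidate representative

def pair(k, N=50_000):
    # Midpoint rule for int f * phi_k over the support (-1/k, 1/k).
    a, h = -1.0 / k, 2.0 / (k * N)
    return sum(f(a + (j + 0.5) * h) * phi(k, a + (j + 0.5) * h)
               for j in range(N)) * h

vals = [pair(k) for k in (1, 10, 100, 1000)]
# int f * phi_k -> 0 while phi_k(0) = 1 for every k: f cannot represent delta_0.
assert all(vals[i] > vals[i + 1] for i in range(len(vals) - 1))
assert phi(1000, 0.0) == 1.0
```

Here the pairings decay like $k^{-1/2}$ while the right side of the supposed identity stays pinned at $1$, which is exactly the contradiction in the proof.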
## Injectivity of the Canonical Embedding
The most important structural property of the map $f \mapsto T_f$ is that it is injective: distinct $L^1_{\mathrm{loc}}$ functions generate distinct distributions.
[quotetheorem:450]
The injectivity has several important consequences. First, it justifies the *identification* of $f$ with $T_f$: since the map is injective, we can (and do) write $f \in \mathcal{D}'(\Omega)$ as shorthand for $T_f \in \mathcal{D}'(\Omega)$, and no ambiguity arises. Second, it gives a precise meaning to the question "is the distribution $T$ a function?": this means "does there exist $f \in L^1_{\mathrm{loc}}(\Omega)$ with $T = T_f$?", and if so, $f$ is unique up to null sets.
Third, it ensures that operations defined distributionally are consistent with classical operations. If $f$ is smooth, the distributional derivative $\partial^\alpha T_f$ agrees with the classical derivative: $\partial^\alpha T_f = T_{\partial^\alpha f}$, because integration by parts gives $(\partial^\alpha T_f)(\varphi) = (-1)^{|\alpha|}T_f(\partial^\alpha \varphi) = (-1)^{|\alpha|}\int f \, \partial^\alpha\varphi \, d\mathcal{L}^n = \int (\partial^\alpha f)\varphi \, d\mathcal{L}^n = T_{\partial^\alpha f}(\varphi)$ (the boundary terms vanish because $\varphi$ has compact support). Without injectivity, the equality $\partial^\alpha T_f = T_{\partial^\alpha f}$ would not pin down $\partial^\alpha f$ uniquely.
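The consistency identity $(\partial T_f)(\varphi) = -T_f(\varphi') = T_{f'}(\varphi)$ can be sanity-checked numerically for a smooth $f$. A sketch with the illustrative choices $f = \sin$ and a standard bump as $\varphi$:

```python
import math

def bump(t):
    # Smooth bump supported in (-1, 1).
    return math.exp(-1.0 / (1.0 - t * t)) if abs(t) < 1.0 else 0.0

def dbump(t):
    # Chain rule: bump'(t) = -2t/(1 - t^2)^2 * bump(t) on (-1, 1).
    return -2.0 * t / (1.0 - t * t) ** 2 * bump(t) if abs(t) < 1.0 else 0.0

N = 100_000
h = 2.0 / N
xs = [-1.0 + (j + 0.5) * h for j in range(N)]

lhs = sum(math.cos(x) * bump(x) for x in xs) * h    # T_{f'}(phi), f = sin
rhs = -sum(math.sin(x) * dbump(x) for x in xs) * h  # -T_f(phi') = (dT_f)(phi)

# No boundary terms: phi vanishes near the endpoints of its support.
assert abs(lhs - rhs) < 1e-6
```

The two quadratures agree to quadrature error, reflecting that the boundary terms in the integration by parts vanish identically.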
[example: When Two Functions Generate the Same Distribution]
Let $f(x) = x$ and $g(x) = x$ for $x \ne 0$, $g(0) = 7$. Then $f = g$ $\mathcal{L}^1$-a.e. on $\mathbb{R}$ (they differ only at the single point $0$, which has measure zero), so $T_f = T_g$ in $\mathcal{D}'(\mathbb{R})$. This is consistent with injectivity: $f$ and $g$ represent the same element of $L^1_{\mathrm{loc}}(\mathbb{R})$, so they are not "distinct" in the relevant sense. The distribution $T_f$ cannot see pointwise values on null sets — only the equivalence class of $f$ in $L^1_{\mathrm{loc}}$.
[/example]
## Regular Tempered Distributions
The same construction works in the [tempered distribution](/page/Tempered%20Distributions) setting, with a stronger integrability condition. For $f \in L^1_{\mathrm{loc}}(\mathbb{R}^n)$, the functional $T_f(\varphi) = \int f\varphi \, d\mathcal{L}^n$ is a distribution on $\mathcal{D}(\mathbb{R}^n)$, but it may not extend continuously to $\mathcal{S}(\mathbb{R}^n)$ — because Schwartz functions have rapid decay but not compact support, and $f$ might grow too fast at infinity for the integral to converge.
A sufficient condition for $T_f$ to define a tempered distribution is that $f$ grow no faster than a polynomial at infinity, in an averaged sense. Specifically, $T_f \in \mathcal{S}'(\mathbb{R}^n)$ whenever $f$ satisfies
\begin{align*}
\int_{\mathbb{R}^n} |f(x)|(1+|x|)^{-N} \, d\mathcal{L}^n(x) &< \infty
\end{align*}
for some $N \ge 0$. This holds for all $f \in L^p(\mathbb{R}^n)$ with $1 \le p \le \infty$ (by Hölder's inequality, as shown on the [Tempered Distributions](/page/Tempered%20Distributions) page), and for all measurable $f$ with $|f(x)| \le C(1+|x|)^M$.
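For the $L^p$ case, the Hölder step can be sketched in one line, with $p'$ the conjugate exponent ($1/p + 1/p' = 1$):
\begin{align*}
\int_{\mathbb{R}^n} |f(x)|(1+|x|)^{-N} \, d\mathcal{L}^n(x) &\le \|f\|_{L^p(\mathbb{R}^n)} \left\|(1+|\cdot|)^{-N}\right\|_{L^{p'}(\mathbb{R}^n)} < \infty,
\end{align*}
where the second factor is finite as soon as $Np' > n$; for $p = 1$ (so $p' = \infty$) any $N \ge 0$ works, since the weight is bounded by $1$.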
The injectivity of $f \mapsto T_f$ in $\mathcal{S}'(\mathbb{R}^n)$ follows from the $\mathcal{D}'(\mathbb{R}^n)$ result: since $\mathcal{D}(\mathbb{R}^n) \subseteq \mathcal{S}(\mathbb{R}^n)$, the condition $T_f = T_g$ in $\mathcal{S}'(\mathbb{R}^n)$ implies $T_f = T_g$ in $\mathcal{D}'(\mathbb{R}^n)$ (by restricting to test functions with compact support), which by the [injectivity theorem](/theorems/450) gives $f = g$ a.e.
[example: A Regular Distribution That Is Not Tempered]
The function $g(x) = e^{x^2}$ on $\mathbb{R}$ is locally integrable (continuous functions are locally integrable), so $T_g \in \mathcal{D}'(\mathbb{R})$ is a well-defined regular distribution. However, $T_g \notin \mathcal{S}'(\mathbb{R})$: for any $N \ge 0$, the integral $\int_\mathbb{R} e^{x^2}(1+|x|)^{-N} \, d\mathcal{L}^1(x) = +\infty$ because $e^{x^2}$ grows faster than any polynomial. This function defines a distribution on $\mathcal{D}(\mathbb{R})$ (compact support kills the growth) but not on $\mathcal{S}(\mathbb{R})$ (polynomial decay cannot compensate). A detailed proof is given in Problem 2 on the [Tempered Distributions](/page/Tempered%20Distributions) page.
[/example]
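A single Schwartz test function already exhibits the failure: taking the Gaussian $\varphi(x) = e^{-x^2/2} \in \mathcal{S}(\mathbb{R})$,
\begin{align*}
\int_\mathbb{R} e^{x^2} e^{-x^2/2} \, d\mathcal{L}^1(x) &= \int_\mathbb{R} e^{x^2/2} \, d\mathcal{L}^1(x) = +\infty,
\end{align*}
so the pairing $T_g(\varphi)$ is not even defined, let alone continuous, on $\mathcal{S}(\mathbb{R})$.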
## When Is a Distribution Regular?
Given a distribution $u \in \mathcal{D}'(\Omega)$, the question "is $u$ a regular distribution?" asks whether there exists $f \in L^1_{\mathrm{loc}}(\Omega)$ with $u = T_f$. This is often the central question in PDE theory: showing that a distributional solution is actually a function (with some regularity) is the content of most regularity theorems.
There is no simple characterisation of which distributions are regular in general. However, several important sufficient conditions exist:
**Order zero.** Every distribution of order zero on $\Omega$ is a signed Radon measure (by the Riesz representation theorem), but not necessarily a regular distribution — point masses have order zero but are singular. However, if a distribution of order zero is also absolutely continuous with respect to $\mathcal{L}^n$, the Radon–Nikodym theorem gives $u = T_f$ for some $f \in L^1_{\mathrm{loc}}$.
**Convolution with test functions.** For any $u \in \mathcal{D}'(\mathbb{R}^n)$ (or $u \in \mathcal{S}'(\mathbb{R}^n)$) and any test function $\varphi$, the convolution $u * \varphi$ is a smooth function (with at most polynomial growth in the tempered case). This is a fundamental regularisation mechanism: convolving with a test function always produces a regular distribution.
**The Fourier transform and $L^2$ membership.** In the tempered setting, a distribution $u \in \mathcal{S}'(\mathbb{R}^n)$ satisfies $u \in L^2(\mathbb{R}^n)$ (meaning $u = T_f$ for some $f \in L^2$) if and only if $\hat{u} \in L^2(\mathbb{R}^n)$. This follows from the [Plancherel theorem](/theorems/247): the Fourier transform is a unitary isomorphism of $L^2$, so $u$ is represented by an $L^2$ function precisely when $\hat{u}$ is.
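The Plancherel identity behind this criterion has an exact discrete analogue that can be checked directly. A sketch in Python using a naive DFT; the sampled Gaussian and the DFT normalisation $\sum_n |f_n|^2 = \tfrac{1}{N}\sum_k |\hat f_k|^2$ are the discrete stand-ins for the continuum statement, not part of it.

```python
import cmath
import math

# Naive DFT: F[k] = sum_n f[n] * exp(-2*pi*i*n*k/N).
N = 128
f = [math.exp(-((n - N / 2) ** 2) / 50.0) for n in range(N)]  # sampled Gaussian

F = [sum(f[n] * cmath.exp(-2.0j * math.pi * n * k / N) for n in range(N))
     for k in range(N)]

# Discrete Plancherel: sum |f|^2 = (1/N) * sum |F|^2.
energy_x = sum(v * v for v in f)
energy_k = sum(abs(v) ** 2 for v in F) / N
assert abs(energy_x - energy_k) < 1e-9 * energy_x
```

The unitarity (up to the fixed normalisation) is what makes "$u \in L^2$ iff $\hat{u} \in L^2$" a two-way criterion rather than a one-way estimate.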
## The Fourier Transform and Regularity
The interaction between regularity (being represented by a function) and the Fourier transform is central to modern PDE theory and is the mechanism underlying the [Sobolev space](/page/Sobolev%20Space) theory.
For a tempered distribution $u \in \mathcal{S}'(\mathbb{R}^n)$, the Fourier transform $\hat{u}$ is another tempered distribution. In general, $\hat{u}$ may or may not be regular — even if $u$ is regular, $\hat{u}$ could be singular (and conversely). The following cases illustrate the possibilities:
[example: Fourier Transform of $L^1$ Functions]
If $f \in L^1(\mathbb{R}^n)$, the Fourier transform $\hat{f}(\xi) = \int_{\mathbb{R}^n} f(x)e^{-i\xi \cdot x} \, d\mathcal{L}^n(x)$ is a bounded continuous function (dominated convergence gives continuity, and $|\hat{f}(\xi)| \le \|f\|_{L^1}$ gives boundedness). So $T_f$ is a regular tempered distribution whose Fourier transform $\widehat{T_f} = T_{\hat{f}}$ is also regular.
However, $\hat{f}$ need not be in $L^1$: the [Fourier transform](/page/Fourier%20Transform) page shows that $\hat{f}$ for $f = \mathbb{1}_{[-1,1]}$ decays like $|\xi|^{-1}$, which is not integrable. This is why the Fourier transform does not map $L^1$ into itself and why the [Schwartz space](/page/Schwartz%20Space) is needed.
[/example]
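Both claims in the example — the $L^\infty$ bound by $\|f\|_{L^1}$ and the $|\xi|^{-1}$ decay — can be verified directly for $f = \mathbb{1}_{[-1,1]}$, whose transform has a closed form. A sketch:

```python
import math

def fhat(xi):
    # Closed form: int_{-1}^{1} e^{-i*xi*x} dx = 2*sin(xi)/xi (value 2 at xi = 0).
    return 2.0 if xi == 0.0 else 2.0 * math.sin(xi) / xi

xis = [0.1 * t for t in range(1, 10_000)]
assert all(abs(fhat(xi)) <= 2.0 for xi in xis)       # |fhat| <= ||f||_{L^1} = 2
assert all(abs(fhat(xi)) <= 2.0 / xi for xi in xis)  # decay like |xi|^{-1}

# But fhat is not integrable: partial integrals of |fhat| keep growing
# (logarithmically), so fhat is not in L^1.
partials = []
total, step = 0.0, 0.01
for t in range(1, 100_001):
    total += abs(fhat(step * t)) * step
    if t % 25_000 == 0:
        partials.append(total)
assert all(a < b for a, b in zip(partials, partials[1:]))
```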
[example: Regular Distribution With Singular Fourier Transform]
The constant function $f(x) = 1$ belongs to $L^\infty(\mathbb{R}^n)$ and defines a regular tempered distribution $T_1$. Its Fourier transform is $\widehat{T_1} = (2\pi)^n\delta_0$, which is singular — there is no locally integrable function $g$ with $T_g = (2\pi)^n\delta_0$. So $T_1$ is regular but $\widehat{T_1}$ is singular.
[/example]
[example: Singular Distribution With Regular Fourier Transform]
The Dirac delta $\delta_0$ is singular. Its Fourier transform is $\hat{\delta}_0 = T_1$ (the constant function $1$), which is regular. So $\delta_0$ is singular but $\hat{\delta}_0$ is regular. This is the distributional dual of the previous example.
[/example]
These examples show that regularity is not preserved by the Fourier transform in general. The one setting where regularity is preserved is $L^2$: the [Plancherel theorem](/theorems/247) guarantees that $u = T_f$ with $f \in L^2$ if and only if $\hat{u} = T_g$ with $g \in L^2$ and $\|g\|_{L^2} = (2\pi)^{n/2}\|f\|_{L^2}$.
This dichotomy — Fourier regularity sometimes preserved, sometimes not — is precisely what makes the [inhomogeneous Sobolev space](/page/Inhomogeneous%20Sobolev%20Spaces) definition non-trivial. The condition $(1+|\xi|^2)^{s/2}\hat{u} \in L^2(\mathbb{R}^n)$ asks that the Fourier transform of $u$, after reweighting, becomes a regular distribution represented by an $L^2$ function. As explained on the [Inhomogeneous Sobolev Spaces](/page/Inhomogeneous%20Sobolev%20Spaces) page, the [well-definedness theorem](/theorems/466) establishes that this condition is internally consistent: it produces a well-defined norm and a complete Hilbert space.
## The Embedding Chain
The map $f \mapsto T_f$ is the middle arrow in the chain of continuous embeddings
\begin{align*}
\mathcal{D}(\mathbb{R}^n) \hookrightarrow \mathcal{S}(\mathbb{R}^n) \hookrightarrow L^p(\mathbb{R}^n) \xrightarrow{\; f \,\mapsto\, T_f \;} \mathcal{S}'(\mathbb{R}^n) \hookrightarrow \mathcal{D}'(\mathbb{R}^n),
\end{align*}
described in detail on the [Distribution](/page/Distribution) page. The first two arrows are set-theoretic inclusions: every test function is a Schwartz function (compact support implies rapid decay), and every Schwartz function is in $L^p$ (rapid decay implies integrability). The third arrow is the canonical embedding discussed on this page — linear, injective, continuous, but *not* a set-theoretic inclusion (it changes the type of the object from a function to a functional). The fourth arrow is restriction: every functional on $\mathcal{S}$ restricts to a functional on the subspace $\mathcal{D} \subseteq \mathcal{S}$.
At the left end of the chain, every object is a classical function: it can be evaluated pointwise, its absolute value $|f(x)|$ is defined everywhere, and its Fourier transform is another Schwartz function. As one moves rightward, pointwise information is progressively lost. In $L^p$, functions are defined only up to null sets; the expression $|f(x_0)|$ depends on the choice of representative. In $\mathcal{S}'$ and $\mathcal{D}'$, the objects may not be functions at all, and expressions like $|u(x)|$ or $|\hat{u}(\xi)|^2$ are undefined until one proves that $u$ (or $\hat{u}$) is a regular distribution.
The critical distinction is between the left half of the chain (where objects are functions) and the right half (where objects are functionals). The map $f \mapsto T_f$ is the crossing point: it is the construction that embeds the world of functions into the world of distributions, faithfully and injectively, so that the two frameworks can interact.