Differential operator

Figure: a harmonic function defined on an annulus. Harmonic functions are exactly those functions which lie in the kernel of the Laplace operator, an important differential operator.

In mathematics, a differential operator is an operator defined as a function of the differentiation operator. It is helpful, as a matter of notation first, to consider differentiation as an abstract operation that accepts a function and returns another function (in the style of a higher-order function in computer science).

This article considers mainly linear operators, which are the most common type. However, non-linear differential operators, such as the Schwarzian derivative, also exist.

Definition

Assume that there is a map A from a function space \mathcal{F}_1 to another function space \mathcal{F}_2, and let f \in \mathcal{F}_2 be the image of u \in \mathcal{F}_1, i.e., f=A(u)\ . A differential operator is represented as a finite linear combination of u and its derivatives up to some order, written as

P(x,D)=\sum_{|\alpha|\le m}a_\alpha(x) D^\alpha\ ,

where \alpha=(\alpha_1,\alpha_2,\cdots,\alpha_n) is a tuple of non-negative integers, called a multi-index, |\alpha|=\alpha_1+\alpha_2+\cdots+\alpha_n is called its length, the a_\alpha(x) are functions on some open domain in n-dimensional space, and D^\alpha=D_1^{\alpha_1} D_2^{\alpha_2} \cdots D_n^{\alpha_n}\ . The derivatives above are understood as derivatives of functions or, sometimes, of distributions or hyperfunctions, with either D_j=-i\frac{\partial}{\partial x_j} or, sometimes, D_j=\frac{\partial}{\partial x_j} .
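
For instance (a minimal sketch using the SymPy library, with illustrative coefficient choices a_{(0,0)} = 1, a_{(1,0)} = x and a_{(0,2)} = xy that are assumptions of this example, not taken from the text), such an operator in two variables can be applied term by term under the convention D_j = \partial/\partial x_j:

# Sketch: apply P(x, D) = sum over multi-indices alpha of a_alpha(x, y) * D^alpha.
import sympy as sp

x, y = sp.symbols('x y')

# multi-indices alpha = (alpha_1, alpha_2) and coefficient functions a_alpha(x, y)
coefficients = {
    (0, 0): 1,      # a_{(0,0)} = 1      (zeroth-order term)
    (1, 0): x,      # a_{(1,0)} = x      (first derivative in x)
    (0, 2): x*y,    # a_{(0,2)} = x*y    (second derivative in y)
}

def apply_P(expr):
    """Apply P(x, D) = sum_alpha a_alpha * D^alpha, with D_j = d/dx_j."""
    total = 0
    for (i, j), a in coefficients.items():
        term = expr
        if i:
            term = sp.diff(term, x, i)
        if j:
            term = sp.diff(term, y, j)
        total += a * term
    return total

u = sp.Function('u')(x, y)
print(apply_P(u))                   # the operator applied to a generic u(x, y)
print(apply_P(sp.sin(x) * y**3))    # ... and to a concrete function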

Notations

The most common differential operator is the action of taking the derivative itself. Common notations for taking the first derivative with respect to a variable x include:

{d \over dx},  D,  D_x, and \partial_x.

When taking higher, nth order derivatives, the operator may also be written:

{d^n \over dx^n}, D^n, or D^n_x.

The derivative of a function f of an argument x is sometimes given as either of the following:

[f(x)]'
f'(x).

Both the creation and the use of the D notation are credited to Oliver Heaviside, who considered differential operators of the form

\sum_{k=0}^n c_k D^k

in his study of differential equations.
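
As a rough illustration (a SymPy sketch with assumed example coefficients c_0 = 1, c_1 = -3, c_2 = 2, not taken from the text), such a constant-coefficient operator can be applied to a function directly; the chosen operator 2D^2 - 3D + 1 happens to annihilate e^x:

# Sketch: apply a Heaviside-style operator  sum_k c_k D^k  with example
# constant coefficients c = (c_0, c_1, c_2) = (1, -3, 2), i.e. 2 D^2 - 3 D + 1.
import sympy as sp

x = sp.symbols('x')
c = [1, -3, 2]                      # assumed example coefficients

def apply_operator(f):
    """Apply sum_{k=0}^{n} c_k D^k to the expression f, with D = d/dx."""
    return sum(ck * sp.diff(f, x, k) for k, ck in enumerate(c))

# (2 D^2 - 3 D + 1) e^x = (2 - 3 + 1) e^x = 0, so e^x lies in the kernel
print(sp.simplify(apply_operator(sp.exp(x))))   # prints 0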

One of the most frequently seen differential operators is the Laplacian operator, defined by

\Delta=\nabla^{2}=\sum_{k=1}^n {\partial^2\over \partial x_k^2}.
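
As a quick sanity check (a SymPy sketch; the choice of test function is an assumption of this example), the function \log(x^2 + y^2), which is defined on an annulus around the origin, is annihilated by the two-dimensional Laplacian and is therefore harmonic, as mentioned in the introduction:

# Sketch: verify that log(x^2 + y^2) is harmonic, i.e. annihilated by the Laplacian.
import sympy as sp

x, y = sp.symbols('x y')
f = sp.log(x**2 + y**2)

laplacian = sp.diff(f, x, 2) + sp.diff(f, y, 2)
print(sp.simplify(laplacian))   # prints 0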

Another differential operator is the Θ operator, or theta operator, defined by[1]

\Theta = z {d \over dz}.

This is sometimes also called the homogeneity operator, because its eigenfunctions are the monomials in z:

\Theta (z^k) = k z^k,\quad k=0,1,2,\dots

In n variables the homogeneity operator is given by

\Theta = \sum_{k=1}^n x_k \frac{\partial}{\partial x_k}.

As in one variable, the eigenspaces of Θ are the spaces of homogeneous polynomials.
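
Both statements can be checked symbolically; the following SymPy sketch (the degree-3 polynomial is an arbitrary example choice) verifies \Theta(z^k) = k z^k and the n-variable homogeneity relation:

# Sketch: the theta operator  Theta = z d/dz  returns k * z^k on monomials,
# and its n-variable analogue multiplies a homogeneous polynomial by its degree.
import sympy as sp

z = sp.symbols('z')
k = sp.symbols('k', integer=True, nonnegative=True)
print(sp.simplify(z * sp.diff(z**k, z)))         # k*z**k

x1, x2, x3 = sp.symbols('x1 x2 x3')
p = x1**2*x2 + 3*x2*x3**2                        # homogeneous of degree 3 (example)
theta_p = sum(v * sp.diff(p, v) for v in (x1, x2, x3))
print(sp.expand(theta_p - 3*p))                  # prints 0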

In writing, following common mathematical convention, the argument of a differential operator is usually placed on the right side of the operator itself. Sometimes an alternative notation is used: the result of applying the operator to the function on its left, the result of applying it to the function on its right, and the difference obtained by applying it to both sides and subtracting, are denoted by arrows as follows:

f \overleftarrow{\partial_x} g = g \cdot \partial_x f
f \overrightarrow{\partial_x} g = f \cdot \partial_x g
f \overleftrightarrow{\partial_x} g = f \cdot \partial_x g - g \cdot \partial_x f.

Such a bidirectional-arrow notation is frequently used for describing the probability current of quantum mechanics.
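
For a concrete illustration (a SymPy sketch with a plane wave e^{ikx} as the assumed example state, and physical constants omitted), the bidirectional combination reduces to a constant, as expected for a stationary current:

# Sketch: the bidirectional combination  f * (dg/dx) - g * (df/dx),
# evaluated for a plane wave psi = exp(i*k*x), as it appears in the
# quantum-mechanical probability current (constants omitted here).
import sympy as sp

x, k = sp.symbols('x k', real=True)
psi = sp.exp(sp.I * k * x)

def bidirectional(f, g):
    """f overleftrightarrow(d/dx) g  =  f * dg/dx  -  g * df/dx."""
    return f * sp.diff(g, x) - g * sp.diff(f, x)

print(sp.simplify(bidirectional(sp.conjugate(psi), psi)))   # prints 2*I*k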

Del

Main article: Del

The differential operator del, also called the nabla operator, is an important vector differential operator. It appears frequently in physics in places like the differential form of Maxwell's equations. In three-dimensional Cartesian coordinates, del is defined by

\nabla = \mathbf{\hat{x}} {\partial \over \partial x}  + \mathbf{\hat{y}} {\partial \over \partial y} + \mathbf{\hat{z}} {\partial \over \partial z}.

Del is used to calculate the gradient, curl, divergence, and Laplacian of various objects.
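
The following sketch uses SymPy's vector module with illustrative scalar and vector fields (the particular fields are assumptions of this example) to form these four objects:

# Sketch: gradient, curl, divergence and Laplacian built from del
# in Cartesian coordinates, using SymPy's vector module.
import sympy as sp
from sympy.vector import CoordSys3D, gradient, divergence, curl

N = CoordSys3D('N')                           # Cartesian frame with unit vectors N.i, N.j, N.k
f = N.x**2 * N.y + N.z                        # an example scalar field
v = N.x*N.y*N.i + N.y*N.z*N.j + N.z*N.x*N.k   # an example vector field

print(gradient(f))                            # del f
print(curl(v))                                # del x v
print(divergence(v))                          # del . v
print(sp.simplify(divergence(gradient(f))))   # del^2 f, the Laplacian of f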

Adjoint of an operator

Given a linear differential operator T

Tu = \sum_{k=0}^n a_k(x) D^k u

the adjoint of this operator is defined as the operator T^* such that

\langle Tu,v \rangle = \langle u, T^*v \rangle

where the notation \langle\cdot,\cdot\rangle is used for the scalar product or inner product. This definition therefore depends on the definition of the scalar product.

Formal adjoint in one variable

On the space of square-integrable functions on a real interval (a, b), the scalar product is defined by

\langle f, g \rangle = \int_a^b f(x) \, \overline{g(x)} \,dx ,

where the line over g(x) denotes the complex conjugate of g(x). If one moreover adds the condition that f or g vanishes as x \to a and x \to b, one can also define the adjoint of T by

T^*u = \sum_{k=0}^n (-1)^k D^k [\overline{a_k(x)}u].

This formula does not explicitly depend on the definition of the scalar product. It is therefore sometimes chosen as a definition of the adjoint operator. When T^* is defined according to this formula, it is called the formal adjoint of T.
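
As a sketch of how the vanishing boundary terms make this work (the first-order operator Tu = x u' on [0, 1] and the polynomial test functions are assumptions of this example), the following SymPy check confirms \langle Tu,v \rangle = \langle u, T^*v \rangle with T^*v = -D(xv):

# Sketch: verify <Tu, v> = <u, T*v> on [0, 1] for T u = x*u' and its formal
# adjoint T* v = -(x*v)', using real test functions that vanish at 0 and 1.
import sympy as sp

x = sp.symbols('x', real=True)
u = x**2 * (1 - x)          # vanishes at both endpoints (example choice)
v = x * (1 - x)**2          # vanishes at both endpoints (example choice)

Tu       = x * sp.diff(u, x)
T_star_v = -sp.diff(x * v, x)

lhs = sp.integrate(Tu * v, (x, 0, 1))
rhs = sp.integrate(u * T_star_v, (x, 0, 1))
print(sp.simplify(lhs - rhs))     # prints 0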

A (formally) self-adjoint operator is an operator equal to its own (formal) adjoint.

Several variables

If Ω is a domain in R^n, and P a differential operator on Ω, then the adjoint of P is defined in L^2(Ω) by duality in an analogous manner:

\langle f, P^* g\rangle_{L^2(\Omega)} = \langle P f, g\rangle_{L^2(\Omega)}

for all smooth L^2 functions f, g. Since smooth functions are dense in L^2, this defines the adjoint on a dense subset of L^2: P* is a densely defined operator.

Example

The Sturm–Liouville operator is a well-known example of a formally self-adjoint operator. This second-order linear differential operator L can be written in the form

Lu = -(pu')'+qu=-(pu''+p'u')+qu=-pu''-p'u'+qu=(-p) D^2 u +(-p') D u + (q)u.

This property can be proven using the formal adjoint definition above.

\begin{align}
L^*u & {} = (-1)^2 D^2 [(-p)u] + (-1)^1 D [(-p')u] + (-1)^0 (qu) \\
 & {} = -D^2(pu) + D(p'u)+qu \\
 & {} = -(pu)''+(p'u)'+qu \\
 & {} = -p''u-2p'u'-pu''+p''u+p'u'+qu \\
 & {} = -p'u'-pu''+qu \\
 & {} = -(pu')'+qu \\
 & {} = Lu
\end{align}

This operator is central to Sturm–Liouville theory where the eigenfunctions (analogues to eigenvectors) of this operator are considered.
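
The same computation can be reproduced symbolically; the following SymPy sketch keeps p, q and u as arbitrary smooth functions and checks that Lu - L^*u vanishes identically:

# Sketch: symbolic check that L u = -(p u')' + q u equals its formal adjoint
# L* u = D^2[(-p) u] - D[(-p') u] + q u, with p, q, u arbitrary smooth functions.
import sympy as sp

x = sp.symbols('x')
p = sp.Function('p')(x)
q = sp.Function('q')(x)
u = sp.Function('u')(x)

Lu = -sp.diff(p * sp.diff(u, x), x) + q * u

# formal adjoint, term by term:  sum_k (-1)^k D^k [a_k u]  with
# a_2 = -p, a_1 = -p', a_0 = q  (everything real here)
L_star_u = ( sp.diff((-p) * u, x, 2)           # (+1) * D^2[(-p) u]
           - sp.diff((-sp.diff(p, x)) * u, x)  # (-1) * D[(-p') u]
           + q * u )                           # (+1) * (q u)

print(sp.simplify(sp.expand(Lu - L_star_u)))   # prints 0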

Properties of differential operators

Differentiation is linear, i.e.,

D(f+g) = (Df)+(Dg)
D(af) = a(Df)

where f and g are functions, and a is a constant.

Any polynomial in D with function coefficients is also a differential operator. We may also compose differential operators by the rule

(D_1 \circ D_2)(f) = D_1(D_2(f)).

Some care is then required: firstly, any function coefficients in the operator D_2 must be differentiable as many times as the application of D_1 requires. To get a ring of such operators we must assume derivatives of all orders of the coefficients used. Secondly, this ring will not be commutative: an operator gD is not in general the same as Dg. For example, we have the relation, basic in quantum mechanics:

Dx - xD = 1.

The subring of operators that are polynomials in D with constant coefficients is, by contrast, commutative. It can be characterised another way: it consists of the translation-invariant operators.
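
Both facts are easy to check on a generic function; the following SymPy sketch (the constant-coefficient operators 2D^2 + 3 and D - 5 are arbitrary example choices) verifies Dx - xD = 1 and the commutativity of constant-coefficient operators:

# Sketch: the composition rule is not commutative.  Acting on a generic
# function f, (D o x - x o D) f = f, i.e. Dx - xD = 1 as operators,
# while two constant-coefficient operators commute.
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')(x)

D  = lambda g: sp.diff(g, x)        # the operator D
mx = lambda g: x * g                # multiplication by x, "the operator x"

print(sp.simplify(D(mx(f)) - mx(D(f))))   # prints f(x):  Dx - xD = 1

P = lambda g: 2*D(D(g)) + 3*g             # 2 D^2 + 3, constant coefficients
Q = lambda g: D(g) - 5*g                  # D - 5,     constant coefficients
print(sp.simplify(P(Q(f)) - Q(P(f))))     # prints 0:  they commute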

The differential operators also obey the shift theorem.

Several variables

The same constructions can be carried out with partial derivatives, differentiation with respect to different variables giving rise to operators that commute (see symmetry of second derivatives).

Ring of polynomial differential operators

Ring of univariate polynomial differential operators

Main article: Weyl algebra

If R is a ring, let R\langle D,X \rangle be the non-commutative polynomial ring over R in the variables D and X, and let I be the two-sided ideal generated by DX-XD-1. Then the ring of univariate polynomial differential operators over R is the quotient ring R\langle D,X\rangle/I. This is a non-commutative simple ring. Every element can be written in a unique way as an R-linear combination of monomials of the form X^a D^b \mod{I}. It supports an analogue of the Euclidean division of polynomials.

Differential modules over R[X] (for the standard derivation) can be identified with modules over R\langle D,X\rangle/I.
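
A minimal sketch of the normal form X^a D^b mentioned above (a small ad hoc Python helper, not a standard library routine): elements are stored as dictionaries mapping the exponent pair (a, b) of X^a D^b to a rational coefficient, and products are normal-ordered with the identity D^b X^c = \sum_k \binom{b}{k} c(c-1)\cdots(c-k+1) X^{c-k} D^{b-k}, which follows from repeated use of DX = XD + 1:

# Sketch of the univariate Weyl algebra over Q: elements are dicts
# {(a, b): coefficient} representing sums of c * X^a D^b, and products are
# normal-ordered with  D^b X^c = sum_k C(b, k) * c!/(c-k)! * X^(c-k) D^(b-k).
from math import comb
from fractions import Fraction

def falling(c, k):
    """Falling factorial c*(c-1)*...*(c-k+1)."""
    out = 1
    for i in range(k):
        out *= c - i
    return out

def multiply(p, q):
    """Product of two normal-ordered elements p, q of R<D, X>/I."""
    result = {}
    for (a, b), cp in p.items():
        for (c, d), cq in q.items():
            for k in range(min(b, c) + 1):
                key = (a + c - k, b + d - k)
                coeff = cp * cq * comb(b, k) * falling(c, k)
                result[key] = result.get(key, Fraction(0)) + coeff
    return {key: val for key, val in result.items() if val != 0}

X = {(1, 0): Fraction(1)}
D = {(0, 1): Fraction(1)}

# D*X - X*D should be the identity element 1 = X^0 D^0
commutator = multiply(D, X)
for key, val in multiply(X, D).items():
    commutator[key] = commutator.get(key, Fraction(0)) - val
print({k: v for k, v in commutator.items() if v != 0})   # {(0, 0): Fraction(1, 1)}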

Ring of multivariate polynomial differential operators

If R is a ring, let R\langle D_1,\ldots,D_n,X_1,\ldots,X_n\rangle be the non-commutative polynomial ring over R in the variables D_1,\ldots,D_n,X_1,\ldots,X_n, and let I be the two-sided ideal generated by the elements D_i X_j-X_j D_i-\delta_{i,j}, D_i D_j -D_j D_i, X_i X_j - X_j X_i for all 1\le i,j\le n, where \delta is the Kronecker delta. Then the ring of multivariate polynomial differential operators over R is the quotient ring R\langle D_1,\ldots,D_n,X_1,\ldots,X_n\rangle/I.

This is a non-commutative simple ring. Every element can be written in a unique way as an R-linear combination of monomials of the form X_1^{a_1}\ldots X_n^{a_n} D_1^{b_1}\ldots D_n^{b_n}.

Coordinate-independent description

In differential geometry and algebraic geometry it is often convenient to have a coordinate-independent description of differential operators between two vector bundles. Let E and F be two vector bundles over a differentiable manifold M. An R-linear mapping of sections P : Γ(E) → Γ(F) is said to be a kth-order linear differential operator if it factors through the jet bundle J^k(E). In other words, there exists a linear mapping of vector bundles

i_P: J^k(E) \rightarrow F

such that

P = i_P\circ j^k

where j^k: Γ(E) → Γ(J^k(E)) is the prolongation that associates to any section of E its k-jet.

This just means that for a given section s of E, the value of P(s) at a point x ∈ M is fully determined by the kth-order infinitesimal behavior of s at x. In particular this implies that P(s)(x) is determined by the germ of s at x, which is expressed by saying that differential operators are local. A foundational result is the Peetre theorem, showing that the converse is also true: any (linear) local operator is differential.

Relation to commutative algebra

An equivalent, but purely algebraic, description of linear differential operators is as follows: an R-linear map P is a kth-order linear differential operator if for any k + 1 smooth functions f_0,\ldots,f_k \in C^\infty(M) we have

[f_k,[f_{k-1},[\cdots[f_0,P]\cdots]]]=0.

Here the bracket [f,P]:\Gamma(E)\rightarrow \Gamma(F) is defined as the commutator

[f,P](s)=P(f\cdot s)-f\cdot P(s).

This characterization of linear differential operators shows that they are particular mappings between modules over a commutative algebra, allowing the concept to be seen as a part of commutative algebra.
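
As a concrete check (a SymPy sketch in which the second-order operator Pu = a(x)u'' and the unspecified functions are assumptions of this example), the triple nested bracket of a second-order operator with three functions vanishes identically, in line with the characterization above:

# Sketch: for a second-order operator P u = a(x) u'' acting on scalar functions,
# the triple nested commutator [f2, [f1, [f0, P]]] vanishes identically.
import sympy as sp

x = sp.symbols('x')
a, f0, f1, f2, s = [sp.Function(name)(x) for name in ('a', 'f0', 'f1', 'f2', 's')]

P = lambda u: a * sp.diff(u, x, 2)           # a second-order operator (example)

def bracket(f, Q):
    """[f, Q](s) = Q(f*s) - f*Q(s), returned as a new operator."""
    return lambda u: Q(f * u) - f * Q(u)

nested = bracket(f2, bracket(f1, bracket(f0, P)))
print(sp.simplify(sp.expand(nested(s))))     # prints 0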

Examples

In complex analysis, differentiation with respect to the complex variable z = x + iy and its conjugate \bar{z} is carried out by the Wirtinger derivatives

 \frac{\partial}{\partial z} = \frac{1}{2} \left( \frac{\partial}{\partial x} - i \frac{\partial}{\partial y} \right) \quad,\quad \frac{\partial}{\partial\bar{z}}= \frac{1}{2} \left( \frac{\partial}{\partial x} + i \frac{\partial}{\partial y} \right) \ .

This approach is also used to study functions of several complex variables and functions of a motor variable.
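
For example (a SymPy sketch; the test function z^3 is an arbitrary holomorphic choice), \partial/\partial\bar{z} annihilates holomorphic expressions, while \partial/\partial z reproduces the ordinary complex derivative:

# Sketch: the Wirtinger operator  d/d(zbar) = (1/2)(d/dx + i d/dy)  annihilates
# holomorphic expressions, here checked on (x + i*y)**3, i.e. z^3.
import sympy as sp

x, y = sp.symbols('x y', real=True)
z = x + sp.I * y
f = z**3

d_dzbar = sp.Rational(1, 2) * (sp.diff(f, x) + sp.I * sp.diff(f, y))
d_dz    = sp.Rational(1, 2) * (sp.diff(f, x) - sp.I * sp.diff(f, y))

print(sp.simplify(d_dzbar))            # prints 0 (Cauchy-Riemann equations)
print(sp.simplify(d_dz - 3*z**2))      # prints 0, matching d(z^3)/dz = 3 z^2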

History

The conceptual step of writing a differential operator as something free-standing is attributed to Louis François Antoine Arbogast in 1800.[2]

References

  1. E. W. Weisstein, "Theta Operator", MathWorld. Retrieved 2009-06-12.
  2. James Gasser (editor), A Boole Anthology: Recent and classical studies in the logic of George Boole (2000), p. 169; Google Books.
