Convex conjugate

In mathematics, convex conjugation is a generalization of the Legendre transformation. It is also known as the Legendre–Fenchel transformation or the Fenchel transformation (after Adrien-Marie Legendre and Werner Fenchel).

Definition

Let X be a real normed vector space, and let X^{*} be the dual space to X. Denote the dual pairing by

\langle \cdot , \cdot \rangle : X^{*} \times X \to \mathbb{R}.

For a functional

f : X \to \mathbb{R} \cup \{ + \infty \}

taking values on the extended real number line, the convex conjugate

f^\star : X^{*} \to \mathbb{R} \cup \{ + \infty \}

is defined in terms of the supremum by

f^{\star} \left( x^{*} \right) := \sup \left \{ \left. \left\langle x^{*} , x \right\rangle - f \left( x \right) \right| x \in X \right\},

or, equivalently, in terms of the infimum by

f^{\star} \left( x^{*} \right) := - \inf \left \{ \left. f \left( x \right) - \left\langle x^{*} , x \right\rangle \right| x \in X \right\}.

This definition can be interpreted as an encoding of the convex hull of the function's epigraph in terms of its supporting hyperplanes.[1] [2]
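
As a minimal numerical sketch of this definition (not part of the article; the helper name conjugate_on_grid and the grid bounds are illustrative assumptions), the supremum can be approximated by maximizing over a finite grid:

    import numpy as np

    def conjugate_on_grid(f, xs, x_star):
        # Approximate f*(x*) = sup_x [x* * x - f(x)] by a maximum over the finite grid xs.
        return np.max(x_star * xs - f(xs))

    # Sanity check: f(x) = x^2/2 is its own conjugate, so f*(1.5) should be about 1.125.
    xs = np.linspace(-10.0, 10.0, 100001)
    print(conjugate_on_grid(lambda x: 0.5 * x**2, xs, 1.5))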

Examples

The convex conjugate of an affine function


f(x) = \left\langle a, x \right\rangle - b, \qquad a \in \mathbb{R}^n, \; b \in \mathbb{R}

is


f^\star\left(x^{*} \right)
= \begin{cases} b,      & x^{*}   =  a
             \\ +\infty, & x^{*}  \ne a.
  \end{cases}
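
To verify this from the definition, substitute f into the supremum:

f^\star\left(x^{*}\right) = \sup_{x \in X} \left( \left\langle x^{*}, x \right\rangle - \left\langle a, x \right\rangle + b \right) = b + \sup_{x \in X} \left\langle x^{*} - a, x \right\rangle,

which equals b when x^{*} = a and +\infty otherwise, since a nonzero linear functional is unbounded above.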

The convex conjugate of a power function


f(x) = \frac{1}{p}|x|^p,\,1<p<\infty

is


f^\star\left(x^{*} \right)
= \frac{1}{q}|x^{*}|^q,\,1<q<\infty

where \tfrac{1}{p} + \tfrac{1}{q} = 1.
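
A short derivation from the definition: the concave function x \mapsto x^{*} x - \tfrac{1}{p}|x|^p is maximized where x^{*} = |x|^{p-1}\operatorname{sgn}(x), i.e. at x = |x^{*}|^{q-1}\operatorname{sgn}(x^{*}). Substituting back and using (q-1)p = q gives

x^{*} x - \tfrac{1}{p}|x|^p = |x^{*}|^q - \tfrac{1}{p}|x^{*}|^q = \tfrac{1}{q}|x^{*}|^q.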

The convex conjugate of the absolute value function

f(x) = \left| x \right|

is


f^\star\left(x^{*} \right)
= \begin{cases} 0,      & \left|x^{*} \right| \le 1
             \\ \infty, & \left|x^{*} \right|  >  1.
  \end{cases}
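
Here the supremum \sup_x \left( x^{*} x - |x| \right) is attained at x = 0 when \left|x^{*}\right| \le 1, since then x^{*} x \le |x| for all x; when \left|x^{*}\right| > 1, choosing x = t \operatorname{sgn}(x^{*}) gives x^{*} x - |x| = \left(\left|x^{*}\right| - 1\right) t \to \infty as t \to \infty.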

The convex conjugate of the exponential function f(x) = e^x is


f^\star\left(x^{*} \right)
= \begin{cases} x^{*} \ln x^{*} - x^{*}      , & x^{*}  > 0
             \\ 0                            , & x^{*}  = 0
             \\ \infty                       , & x^{*}  < 0.
  \end{cases}
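
To derive this, note that for x^{*} > 0 the maximand x^{*} x - e^x has derivative x^{*} - e^x, vanishing at x = \ln x^{*}, which yields x^{*} \ln x^{*} - x^{*}; for x^{*} = 0 the supremum of -e^x is 0, approached as x \to -\infty but never attained; and for x^{*} < 0 the maximand is unbounded above as x \to -\infty.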

The convex conjugate and the Legendre transform of the exponential function agree, except that the domain of the convex conjugate is strictly larger, as the Legendre transform is defined only for positive real numbers.

Connection with expected shortfall (average value at risk)

Let F denote a cumulative distribution function of a random variable X. Then (integrating by parts),

f(x):= \int_{-\infty}^x F(u)\,du = \operatorname{E}\left[\max(0,x-X)\right] = x-\operatorname{E} \left[\min(x,X)\right]

has the convex conjugate


f^\star(p) = \int_0^p F^{-1}(q) \, dq = (p-1)F^{-1}(p) + \operatorname{E}\left[\min(F^{-1}(p),X)\right] = p F^{-1}(p) - \operatorname{E}\left[\max(0,F^{-1}(p)-X)\right].
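
A numerical illustration of this identity (a sketch, not from the article; it assumes SciPy is available and takes X to be standard normal, for which \int_0^p F^{-1}(q)\,dq = -\varphi(F^{-1}(p)) with \varphi the normal density):

    from scipy import stats, integrate

    # f*(p) = integral_0^p F^{-1}(q) dq for X ~ N(0,1); the substitution q = F(z)
    # turns the integral into int z phi(z) dz = -phi(F^{-1}(p)).
    p = 0.95
    numeric, _ = integrate.quad(stats.norm.ppf, 0.0, p)  # integrable singularity at q = 0
    closed_form = -stats.norm.pdf(stats.norm.ppf(p))
    print(numeric, closed_form)  # both approximately -0.1031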

Ordering

One interpretation involves the transform

f^\text{inc}(x) := \arg \sup_t \, t \cdot x - \int_0^1 \max\{t - f(u), 0\} \, \mathrm{d}u,

which is a nondecreasing rearrangement of the initial function f; in particular, f^\text{inc} = f when f is nondecreasing.

Properties

The convex conjugate of a closed convex function is again a closed convex function. The convex conjugate of a polyhedral convex function (a convex function with polyhedral epigraph) is again a polyhedral convex function.

Order reversing

Convex conjugation is order-reversing: if f \le g then f^* \ge g^*. Here

 (f \le g ) :\iff (\forall x, f(x) \le g(x)).

For a family of functions \left(f_\alpha\right)_\alpha, it follows from the fact that suprema may be interchanged that

\left(\inf_\alpha f_\alpha\right)^*(x^*)= \sup_\alpha f_\alpha^*(x^*),

and from the max–min inequality that

\left(\sup_\alpha f_\alpha\right)^*(x^*)\le \inf_\alpha f_\alpha^*(x^*).
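
The first identity is simply the interchange of two suprema:

\left(\inf_\alpha f_\alpha\right)^*(x^*) = \sup_{x}\, \sup_\alpha \left( \langle x^*, x \rangle - f_\alpha(x) \right) = \sup_\alpha \, \sup_{x} \left( \langle x^*, x \rangle - f_\alpha(x) \right) = \sup_\alpha f_\alpha^*(x^*).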

Biconjugate

The convex conjugate of a function is always lower semi-continuous. The biconjugate f^{**} (the convex conjugate of the convex conjugate) is also the closed convex hull, i.e. the largest lower semi-continuous convex function with f^{**}\le f. For proper functions f,

f = f^{**} if and only if f is convex and lower semi-continuous, by the Fenchel–Moreau theorem.
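
A standard one-dimensional illustration (not from the source text): the nonconvex function

f(x) = \min\left\{ (x-1)^2, \, (x+1)^2 \right\}

has biconjugate

f^{**}(x) = \begin{cases} (|x|-1)^2, & |x| > 1 \\ 0, & |x| \le 1, \end{cases}

the convex hull of f, obtained by replacing the nonconvex dip between the two wells with the flat segment at height 0.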

Fenchel's inequality

For any function f and its convex conjugate f^*, Fenchel's inequality (also known as the Fenchel–Young inequality) holds for every x \in X and p \in X^*:


\left\langle p,x \right\rangle \le f(x) + f^*(p).
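
The inequality is immediate from the definition of f^* as a supremum: f^*(p) \ge \langle p, x \rangle - f(x) for every x \in X. Specializing to the power function above (written with exponents r and s here, to avoid a clash with the dual variable p) recovers Young's inequality,

x \, p \le \frac{|x|^r}{r} + \frac{|p|^s}{s}, \qquad \frac{1}{r} + \frac{1}{s} = 1, \quad 1 < r < \infty.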

Convexity

For two functions f_0 and f_1 and a number 0\le \lambda\le 1 the convexity relation

\left((1-\lambda)f_0+\lambda f_1\right)^\star\le (1-\lambda)f_0^\star+ \lambda f_1^\star

holds. Thus the conjugation operation \star is itself a convex mapping.

Infimal convolution

The infimal convolution (or epi-sum) of two functions f and g is defined as

 \left(f \Box  g\right)(x) = \inf \left \{ f(x-y) + g(y) \, | \, y \in \mathbb{R}^n \right \}.

Let f_1, \ldots, f_m be proper, convex and lower semi-continuous functions on \mathbb{R}^n. Then the infimal convolution is convex and lower semi-continuous (but not necessarily proper),[3] and satisfies

 \left( f_1 \Box \cdots \Box f_m \right)^\star = f_1^\star + \cdots + f_m^\star.

The infimal convolution of two functions has a geometric interpretation: The (strict) epigraph of the infimal convolution of two functions is the Minkowski sum of the (strict) epigraphs of those functions.[4]
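
As a concrete instance (a standard example, not from the source text): taking f(x) = |x| and g(x) = \tfrac{1}{2}x^2 on \mathbb{R}, the infimal convolution is the Huber function

\left(f \Box g\right)(x) = \begin{cases} \tfrac{1}{2}x^2, & |x| \le 1 \\ |x| - \tfrac{1}{2}, & |x| > 1, \end{cases}

and its conjugate is f^\star + g^\star: the conjugate of the absolute value from the examples above (0 on [-1,1], +\infty outside) plus \tfrac{1}{2}(x^\star)^2, i.e. \tfrac{1}{2}(x^\star)^2 for |x^\star| \le 1 and +\infty otherwise.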

Maximizing argument

If the function f is differentiable, then its derivative is the maximizing argument in the computation of the convex conjugate:

f^\prime(x) = x^\star(x):= \arg\sup_{x^\star} {\langle x, x^\star\rangle} -f^\star(x^\star) and
f^{\star\prime}(x^\star) = x(x^\star):= \arg\sup_x {\langle x, x^\star\rangle}-f(x);

whence

x= \nabla f^{\star}(\nabla f(x)),
x^\star= \nabla f(\nabla f^{\star}(x^\star)),

and moreover

f^{\prime\prime}(x) \cdot f^{\star\prime\prime} (x^\star(x))=1,
f^{\star\prime\prime}(x^\star)\cdot f^{\prime\prime}(x(x^\star))=1.
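
For example (a direct check of these formulas): for f(x) = e^x with f^\star(x^\star) = x^\star \ln x^\star - x^\star on x^\star > 0, the derivatives f^\prime(x) = e^x and f^{\star\prime}(x^\star) = \ln x^\star are mutually inverse, and since x^\star(x) = e^x,

f^{\prime\prime}(x) \cdot f^{\star\prime\prime}(x^\star(x)) = e^x \cdot \frac{1}{e^x} = 1.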

Scaling properties

If, for some \gamma>0, \,g(x)=\alpha+ \beta x +\gamma \cdot f(\lambda x+\delta), then

g^\star(x^\star)= -\alpha- \delta\frac{x^\star-\beta}\lambda +\gamma \cdot f^\star \left(\frac {x^\star-\beta}{\lambda \gamma}\right).
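
As a sanity check (not in the source text): taking \alpha = \beta = \delta = 0 and \gamma = 1 recovers \left(f(\lambda\,\cdot)\right)^\star(x^\star) = f^\star(x^\star/\lambda), while \lambda = 1, \alpha = \beta = \delta = 0 gives \left(\gamma f\right)^\star(x^\star) = \gamma f^\star(x^\star/\gamma); both agree with the table of conjugates below.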

In case of an additional parameter (\alpha, say) moreover

\frac{\partial f^\star}{\partial \alpha}\left(x^\star\right) = -\frac{\partial f}{\partial \alpha}\left(\tilde x\right),

where \tilde x is chosen to be the maximizing argument.

Behavior under linear transformations

Let A be a bounded linear operator from X to Y. For any convex function f on X, one has

 \left(A f\right)^\star = f^\star A^\star

where

 (A f)(y) = \inf\{ f(x) : x \in X , A x = y \}

is the image of f under A (also called the infimal postcomposition of f by A), and A^* is the adjoint operator of A.[5]

A closed convex function f is symmetric with respect to a given set G of orthogonal linear transformations,

f\left(A x\right) = f(x), \; \forall x, \; \forall A \in G

if and only if its convex conjugate f* is symmetric with respect to G.

Table of selected convex conjugates

The following table lists the convex conjugates of many common functions, as well as a few useful transformation rules.[6]

g(x) | \operatorname{dom}(g) | g^*(x^*) | \operatorname{dom}(g^*)
f(ax) (where a \neq 0) | X | f^*\left(\frac{x^*}{a}\right) | X^*
f(x + b) | X | f^*(x^*) - \langle b, x^* \rangle | X^*
a f(x) (where a > 0) | X | a f^*\left(\frac{x^*}{a}\right) | X^*
\alpha + \beta x + \gamma \cdot f(\lambda x + \delta) | X | -\alpha - \delta\frac{x^* - \beta}{\lambda} + \gamma \cdot f^*\left(\frac{x^* - \beta}{\gamma\lambda}\right) \quad (\gamma > 0) | X^*
\frac{|x|^p}{p} (where p > 1) | \mathbb{R} | \frac{|x^*|^q}{q} (where \frac{1}{p} + \frac{1}{q} = 1) | \mathbb{R}
\frac{-x^p}{p} (where 0 < p < 1) | \mathbb{R}_+ | \frac{-(-x^*)^q}{q} (where \frac{1}{p} + \frac{1}{q} = 1) | \mathbb{R}_{--}
\sqrt{1 + x^2} | \mathbb{R} | -\sqrt{1 - (x^*)^2} | [-1,1]
-\log(x) | \mathbb{R}_{++} | -(1 + \log(-x^*)) | \mathbb{R}_{--}
e^x | \mathbb{R} | \begin{cases}x^* \log(x^*) - x^* & \text{if } x^* > 0 \\ 0 & \text{if } x^* = 0\end{cases} | \mathbb{R}_+
\log\left(1 + e^x\right) | \mathbb{R} | \begin{cases}x^* \log(x^*) + (1 - x^*)\log(1 - x^*) & \text{if } 0 < x^* < 1 \\ 0 & \text{if } x^* = 0, 1\end{cases} | [0,1]
-\log\left(1 - e^x\right) | \mathbb{R} | \begin{cases}x^* \log(x^*) - (1 + x^*)\log(1 + x^*) & \text{if } x^* > 0 \\ 0 & \text{if } x^* = 0\end{cases} | \mathbb{R}_+
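
A quick numerical spot check of one table row (a sketch; the grid bounds and the evaluation point are arbitrary choices): for g(x) = -\log(x) the table gives g^*(x^*) = -(1 + \log(-x^*)) on \mathbb{R}_{--}:

    import numpy as np

    xs = np.linspace(1e-6, 50.0, 2_000_000)       # grid over dom(g) = (0, infinity)
    x_star = -2.0                                 # any point with x* < 0
    approx = np.max(x_star * xs - (-np.log(xs)))  # sup_x [x* x + log x]
    exact = -(1.0 + np.log(2.0))                  # table entry -(1 + log(-x*)) at x* = -2
    print(approx, exact)                          # both approximately -1.6931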

References

  1. "Legendre Transform". Retrieved September 13, 2012.
  2. Nielsen, Frank. "Legendre transformation and information geometry" (PDF).
  3. Phelps, Robert (1991). Convex Functions, Monotone Operators and Differentiability (2 ed.). Springer. p. 42. ISBN 0-387-56715-1.
  4. Bauschke, Heinz H.; Goebel, Rafal; Lucet, Yves; Wang, Xianfu (2008). "The Proximal Average: Basic Theory". SIAM Journal on Optimization 19 (2): 766. doi:10.1137/070687542.
  5. Ioffe, A. D.; Tichomirov, V. M. (1979). Theorie der Extremalaufgaben. Deutscher Verlag der Wissenschaften. Satz 3.4.3.
  6. Borwein, Jonathan; Lewis, Adrian (2006). Convex Analysis and Nonlinear Optimization: Theory and Examples (2 ed.). Springer. pp. 50–51. ISBN 978-0-387-29570-1.
