Entropic value at risk
In financial mathematics and stochastic optimization, the concept of risk measure is used to quantify the risk involved in a random outcome or risk position. Many risk measures have hitherto been proposed, each having certain characteristics. The entropic value-at-risk (EVaR) is a coherent risk measure introduced by Ahmadi-Javid,[1][2] which is an upper bound for the value at risk (VaR) and the conditional value-at-risk (CVaR), obtained from the Chernoff inequality. The EVaR can also be represented by using the concept of relative entropy. Because of its connection with the VaR and the relative entropy, this risk measure is called "entropic value-at-risk". The EVaR was developed to tackle some computational inefficiencies of the CVaR. Getting inspiration from the dual representation of the EVaR, Ahmadi-Javid[1][2] developed a wide class of coherent risk measures, called g-entropic risk measures. Both the CVaR and the EVaR are members of this class.
Definition
Let $(\Omega,\mathcal{F},P)$ be a probability space with $\Omega$ a set of all simple events, $\mathcal{F}$ a $\sigma$-algebra of subsets of $\Omega$ and $P$ a probability measure on $\mathcal{F}$. Let $X$ be a random variable and $\mathbf{L}_{M^{+}}$ be the set of all Borel measurable functions $X:\Omega\to\mathbb{R}$ whose moment-generating function $M_X(z)$ exists for all $z\geq 0$. The entropic value-at-risk (EVaR) of $X\in\mathbf{L}_{M^{+}}$ with confidence level $1-\alpha$ is defined as follows:
- $\text{EVaR}_{1-\alpha}(X):=\inf_{z>0}\left\{z^{-1}\ln\left(\frac{M_X(z)}{\alpha}\right)\right\}.$ (1)
In finance, the random variable $X$ in the above equation is used to model the losses of a portfolio.
Consider the Chernoff inequality
- $\Pr(X\geq a)\leq e^{-za}M_X(z),\quad \forall z>0.$ (2)
Solving the equation $e^{-za}M_X(z)=\alpha$ for $a$ results in $a_X(\alpha,z):=z^{-1}\ln\left(\frac{M_X(z)}{\alpha}\right)$. By considering the equation (1), we see that $\text{EVaR}_{1-\alpha}(X)=\inf_{z>0}\{a_X(\alpha,z)\}$, which shows the relationship between the EVaR and the Chernoff inequality. It is worth noting that $a_X(1,z)=z^{-1}\ln M_X(z)$ is the entropic risk measure or exponential premium, which is a concept used in finance and insurance, respectively.
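As a concrete illustration of how (1) arises from the Chernoff bound, the following sketch (our own; the function names and search bounds are illustrative, not from the source) minimizes $a_X(\alpha,z)$ numerically for a normal random variable and checks the result against the closed form $\mu+\sigma\sqrt{-2\ln\alpha}$ given in the Examples section.

```python
import math

def evar_normal_numeric(mu, sigma, alpha):
    # Chernoff bound of eq. (2) solved for a, as in eq. (1):
    #   a_X(alpha, z) = z^{-1} * ln(M_X(z) / alpha).
    # For X ~ N(mu, sigma^2), M_X(z) = exp(mu*z + sigma^2 * z^2 / 2), so
    #   a_X(alpha, z) = mu + sigma^2 * z / 2 + ln(1/alpha) / z.
    def chernoff_bound(z):
        return mu + 0.5 * sigma ** 2 * z + math.log(1.0 / alpha) / z
    lo, hi = 1e-6, 100.0
    for _ in range(200):  # ternary search; the bound is convex in z
        m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if chernoff_bound(m1) < chernoff_bound(m2):
            hi = m2
        else:
            lo = m1
    return chernoff_bound((lo + hi) / 2)

def evar_normal_closed(mu, sigma, alpha):
    # Closed form for the normal case: mu + sigma * sqrt(-2 ln alpha).
    return mu + sigma * math.sqrt(-2.0 * math.log(alpha))
```

Every fixed $z>0$ yields a valid Chernoff upper bound; the EVaR is the best such bound.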
Let $\mathbf{L}_{M}$ be the set of all Borel measurable functions $X:\Omega\to\mathbb{R}$ whose moment-generating function $M_X(z)$ exists for all $z$. The dual representation (or robust representation) of the EVaR is as follows:
- $\text{EVaR}_{1-\alpha}(X)=\sup_{Q\in\Im}(\text{E}_Q(X))$ (3)
where $X\in\mathbf{L}_{M}$, and $\Im$ is a set of probability measures on $(\Omega,\mathcal{F})$ with $\Im=\{Q\ll P:D_{KL}(Q\|P)\leq -\ln\alpha\}$. Note that $D_{KL}(Q\|P):=\int\frac{dQ}{dP}\left(\ln\frac{dQ}{dP}\right)dP$ is the relative entropy of $Q$ with respect to $P$, also called the Kullback–Leibler divergence. The dual representation of the EVaR discloses the reason behind its naming.
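The dual representation (3) can be verified numerically for a discrete distribution: the supremum is attained by the exponentially tilted measure $dQ/dP=e^{z^{*}X}/M_X(z^{*})$ at the minimizing $z^{*}$ of (1), for which the relative-entropy constraint is binding. The sketch below (a minimal check with a hypothetical four-point loss distribution; all names are ours) illustrates this.

```python
import math

# Hypothetical discrete loss distribution (illustrative only, not from the source).
xs = [-1.0, 0.0, 1.0, 3.0]
ps = [0.3, 0.4, 0.2, 0.1]
alpha = 0.2

def mgf(z):
    return sum(p * math.exp(z * x) for x, p in zip(xs, ps))

def objective(z):
    # eq. (1): z^{-1} * ln(M_X(z) / alpha)
    return (math.log(mgf(z)) - math.log(alpha)) / z

lo, hi = 1e-6, 50.0
for _ in range(200):  # ternary search for the minimizing z
    m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
    if objective(m1) < objective(m2):
        hi = m2
    else:
        lo = m1
z_star = (lo + hi) / 2
evar = objective(z_star)

# The exponentially tilted measure dQ/dP = e^{z* X} / M_X(z*) attains the
# supremum in the dual representation (3): E_Q(X) equals the EVaR, and the
# relative-entropy constraint D_KL(Q || P) <= -ln(alpha) is binding.
qs = [p * math.exp(z_star * x) / mgf(z_star) for x, p in zip(xs, ps)]
e_q = sum(q * x for q, x in zip(qs, xs))
kl = sum(q * math.log(q / p) for q, p in zip(qs, ps))
```

The tilting shifts probability mass toward large losses, which is exactly the "adversarial reweighting" picture behind the robust representation.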
Properties
- The EVaR is a coherent risk measure.
- The moment-generating function $M_X(z)$ can be represented by the EVaR: for all $X\in\mathbf{L}_{M}$ and $z>0$,
  $M_X(z)=\sup_{0<\alpha\leq 1}\{\alpha\exp(z\,\text{EVaR}_{1-\alpha}(X))\}.$ (4)
- For $X,Y\in\mathbf{L}_{M}$, $\text{EVaR}_{1-\alpha}(X)=\text{EVaR}_{1-\alpha}(Y)$ for all $\alpha\in(0,1]$ if and only if $F_X(b)=F_Y(b)$ for all $b\in\mathbb{R}$; that is, the EVaR fully characterizes the distribution.
- The entropic risk measure with parameter $\theta$ can be represented by means of the EVaR: for all $X\in\mathbf{L}_{M}$ and $\theta>0$,
  $\theta^{-1}\ln M_X(\theta)=\sup_{0<\alpha\leq 1}\{\text{EVaR}_{1-\alpha}(X)+\theta^{-1}\ln\alpha\}.$ (5)
- The EVaR with confidence level $1-\alpha$ is the tightest possible upper bound that can be obtained from the Chernoff inequality for the VaR and the CVaR with confidence level $1-\alpha$:
  $\text{VaR}_{1-\alpha}(X)\leq\text{CVaR}_{1-\alpha}(X)\leq\text{EVaR}_{1-\alpha}(X).$ (6)
- The following inequality holds for the EVaR:
  $\text{E}(X)\leq\text{EVaR}_{1-\alpha}(X)\leq\text{esssup}(X)$ (7)
  where $\text{E}(X)$ is the expected value of $X$ and $\text{esssup}(X)$ is the essential supremum of $X$, i.e., $\inf\{t\in\mathbb{R}:\Pr(X\leq t)=1\}$. Moreover, $\text{EVaR}_{0}(X)=\text{E}(X)$ and $\lim_{\alpha\to 0}\text{EVaR}_{1-\alpha}(X)=\text{esssup}(X)$.
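The ordering (6) can also be checked empirically. The following sketch is illustrative and uses our own estimator choices (not from the source): VaR as an empirical quantile, CVaR as the corresponding tail mean, and EVaR computed from the empirical moment-generating function; losses follow the sign convention of eq. (1), so larger values are worse.

```python
import math
import random
import statistics

def empirical_var_cvar_evar(losses, alpha):
    xs = sorted(losses)
    n = len(xs)
    k = math.ceil((1.0 - alpha) * n) - 1   # index of the (1-alpha)-quantile
    var = xs[k]                            # VaR: empirical quantile
    cvar = statistics.fmean(xs[k:])        # CVaR: mean of the worst alpha-tail
    def objective(z):                      # eq. (1) with the empirical MGF
        m = statistics.fmean(math.exp(z * x) for x in xs)
        return (math.log(m) - math.log(alpha)) / z
    lo, hi = 1e-6, 20.0
    for _ in range(100):                   # ternary search over z > 0
        m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if objective(m1) < objective(m2):
            hi = m2
        else:
            lo = m1
    evar = objective((lo + hi) / 2)
    return var, cvar, evar

random.seed(0)
sample = [random.gauss(0.0, 1.0) for _ in range(20000)]
v, c, e = empirical_var_cvar_evar(sample, alpha=0.05)
```

For a standard normal sample at $\alpha=0.05$ the three values cluster near $1.64$, $2.06$ and $2.45$ respectively, and the chain $\text{VaR}\leq\text{CVaR}\leq\text{EVaR}$ holds for any sample, since any fixed $z$ gives an upper bound on the infimum in (1).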
Examples


For $X\sim N(\mu,\sigma^{2})$ (normal distribution),
- $\text{EVaR}_{1-\alpha}(X)=\mu+\sigma\sqrt{-2\ln\alpha}.$ (8)
For $X\sim U(a,b)$ (continuous uniform distribution),
- $\text{EVaR}_{1-\alpha}(X)=\inf_{z>0}\left\{z^{-1}\ln\left(\frac{e^{zb}-e^{za}}{z(b-a)\alpha}\right)\right\}.$ (9)
Figures 1 and 2 show the comparison of the VaR, CVaR and EVaR for the normal and the uniform distribution, respectively.
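Unlike (8), formula (9) has no closed form, but the infimum is one-dimensional and easy to evaluate numerically. A minimal sketch (our own; the search bounds are illustrative):

```python
import math

def evar_uniform(a, b, alpha):
    # eq. (9): EVaR for X ~ U(a, b), using the uniform MGF
    #   M_X(z) = (e^{zb} - e^{za}) / (z (b - a)).
    def objective(z):
        mgf = (math.exp(z * b) - math.exp(z * a)) / (z * (b - a))
        return (math.log(mgf) - math.log(alpha)) / z
    lo, hi = 1e-6, 200.0
    for _ in range(200):  # ternary search over z > 0
        m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if objective(m1) < objective(m2):
            hi = m2
        else:
            lo = m1
    return objective((lo + hi) / 2)
```

For $U(0,1)$ the result lies strictly between the mean $0.5$ and the essential supremum $1$, consistent with (7), and it increases as $\alpha$ decreases.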
Optimization
Let $\rho$ be a risk measure. Consider the optimization problem
- $\min_{\boldsymbol{w}\in\boldsymbol{W}}\rho(G(\boldsymbol{w},\boldsymbol{\psi}))$ (10)
where $\boldsymbol{w}\in\boldsymbol{W}\subseteq\mathbb{R}^{n}$ is an $n$-dimensional real decision vector, $\boldsymbol{\psi}$ is an $m$-dimensional real random vector with a known probability distribution and the function $G(\boldsymbol{w},\cdot)$ is a Borel measurable function for all values $\boldsymbol{w}\in\boldsymbol{W}$. If $\rho$ is the $\text{EVaR}$, then the problem (10) becomes as follows:
- $\min_{\boldsymbol{w},t}\left\{t\ln M_{G(\boldsymbol{w},\boldsymbol{\psi})}(t^{-1})-t\ln\alpha:\boldsymbol{w}\in\boldsymbol{W},t>0\right\}$ (11)
Let $\mathbf{S}_{\boldsymbol{\psi}}$ be the support of the random vector $\boldsymbol{\psi}$. If $G(\cdot,\boldsymbol{s})$ is convex for all $\boldsymbol{s}\in\mathbf{S}_{\boldsymbol{\psi}}$, then the objective function of the problem (11) is also convex. If $G(\boldsymbol{w},\boldsymbol{\psi})$ has the form
- $G(\boldsymbol{w},\boldsymbol{\psi})=g_{0}(\boldsymbol{w})+\sum_{i=1}^{m}g_{i}(\boldsymbol{w})\psi_{i}$ (12)
and $\psi_{1},\dots,\psi_{m}$ are independent random variables in $\mathbf{L}_{M}$, then (11) becomes
- $\min_{\boldsymbol{w},t}\left\{g_{0}(\boldsymbol{w})+t\sum_{i=1}^{m}\ln M_{\psi_{i}}\left(\frac{g_{i}(\boldsymbol{w})}{t}\right)-t\ln\alpha:\boldsymbol{w}\in\boldsymbol{W},t>0\right\}$ (13)
which is computationally tractable. But for this case, if one uses the CVaR in problem (10), then the resulting problem becomes as follows:
- $\min_{\boldsymbol{w},t}\left\{t+\frac{1}{\alpha}\text{E}\left[G(\boldsymbol{w},\boldsymbol{\psi})-t\right]_{+}:\boldsymbol{w}\in\boldsymbol{W},t\in\mathbb{R}\right\}$ (14)
It can be shown that by increasing the dimension of $\boldsymbol{\psi}$, problem (14) is computationally intractable even for simple cases. For example, assume that $\psi_{1},\dots,\psi_{m}$ are independent discrete random variables that take $k$ distinct values. For fixed values of $\boldsymbol{w}$ and $t$, the complexity of computing the objective function given in problem (13) is of order $m$ while the computing time for the objective function of problem (14) is of order $k^{m}$. For illustration, assume that $k=2$, $m=100$ and the summation of two numbers takes $10^{-12}$ seconds. For computing the objective function of problem (14) one needs about $4\times 10^{10}$ years, whereas the evaluation of the objective function of problem (13) takes about $10^{-10}$ seconds. This shows that the formulation with the EVaR outperforms the formulation with the CVaR (see [2] for more details).
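The complexity gap between (13) and (14) can be made concrete. The sketch below is an illustrative construction of ours (not from the source): in (12) take $g_{0}(\boldsymbol{w})=0$ and $g_{i}(\boldsymbol{w})=w_{i}$, with i.i.d. two-point $\psi_{i}$, so $k=2$. The EVaR objective then needs only $m$ one-dimensional MGF evaluations, while the exact CVaR objective enumerates all $k^{m}$ joint scenarios.

```python
import itertools
import math

# Illustrative setup: psi_i take the value 0 or 1 with probability 1/2 each.
VALUES = (0.0, 1.0)
PROBS = (0.5, 0.5)

def evar_objective(w, t, alpha):
    # eq. (13): only the m one-dimensional MGFs M_{psi_i}(g_i(w)/t) are
    # evaluated, so the work is linear in m.
    total = -t * math.log(alpha)
    for wi in w:
        m = sum(p * math.exp(v * wi / t) for v, p in zip(VALUES, PROBS))
        total += t * math.log(m)
    return total

def cvar_objective(w, t, alpha):
    # eq. (14): computing E[G(w, psi) - t]_+ exactly enumerates all k^m
    # joint scenarios of (psi_1, ..., psi_m).
    expectation, scenarios = 0.0, 0
    for combo in itertools.product(range(len(VALUES)), repeat=len(w)):
        prob = math.prod(PROBS[j] for j in combo)
        loss = sum(wi * VALUES[j] for wi, j in zip(w, combo))
        expectation += prob * max(loss - t, 0.0)
        scenarios += 1
    return t + expectation / alpha, scenarios
```

Already at $m=12$ the CVaR objective touches $2^{12}=4096$ scenarios per evaluation, while the EVaR objective uses 12 MGF evaluations; at $m=100$ the enumeration is hopeless, which is the point of the illustration above.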
Generalization (g-entropic risk measures)
Drawing inspiration from the dual representation of the EVaR given in (3), one can define a wide class of information-theoretic coherent risk measures, which are introduced in.[1][2] Let $g$ be a convex proper function with $g(1)=0$ and $\beta$ be a non-negative number. The $g$-entropic risk measure with divergence level $\beta$ is defined as
- $\text{ER}_{g,\beta}(X):=\sup_{Q\in\Im}\text{E}_Q(X)$ (15)
where $\Im=\{Q\ll P:H_{g}(P,Q)\leq\beta\}$ in which $H_{g}(P,Q):=\int g\left(\frac{dQ}{dP}\right)dP$ is the generalized relative entropy of $Q$ with respect to $P$. A primal representation of the class of $g$-entropic risk measures can be obtained as follows:
- $\text{ER}_{g,\beta}(X)=\inf_{t>0,\mu\in\mathbb{R}}\left\{t\left[\mu+\text{E}\left(g^{*}\left(\frac{X}{t}-\mu+\beta\right)\right)\right]\right\}$ (16)
where $g^{*}$ is the conjugate of $g$. By considering
- $g(x)=x\ln x$ (17)
with $g^{*}(x)=e^{x-1}$ and $\beta=-\ln\alpha$, $\alpha\in(0,1]$, the EVaR formula can be deduced. The CVaR is also a $g$-entropic risk measure, which can be obtained from (16) by setting
- $g(x)=\begin{cases}0 & 0\leq x\leq\frac{1}{\alpha}\\+\infty & \text{otherwise}\end{cases}$ (18)
with $g^{*}(x)=\frac{1}{\alpha}\max\{0,x\}$ and $\beta=0$, $\alpha\in(0,1]$ (see [1][3] for more details).
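The deduction of the EVaR from the primal representation can be sketched as follows (our derivation sketch: substitute $g^{*}(x)=e^{x-1}$ and $\beta=\ln\alpha^{-1}$ into (16), and note that the first-order condition in $\mu$ gives $\text{E}\,e^{X/t-\mu+\beta-1}=1$):

```latex
\begin{align*}
\inf_{t>0,\,\mu\in\mathbb{R}} t\Big[\mu + \mathrm{E}\big(e^{X/t-\mu+\beta-1}\big)\Big]
  &= \inf_{t>0} t\big[\beta + \ln M_X(t^{-1})\big]
     \qquad (\text{optimal } \mu = \beta - 1 + \ln M_X(t^{-1}))\\
  &= \inf_{z>0} z^{-1}\ln\!\big(M_X(z)/\alpha\big)
     \qquad (z = t^{-1},\ \beta = \ln\alpha^{-1})\\
  &= \mathrm{EVaR}_{1-\alpha}(X).
\end{align*}
```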
For more results on $g$-entropic risk measures, see [4].
See also
- Stochastic optimization
- Risk measure
- Coherent risk measure
- Value at risk
- Conditional value-at-risk
- Expected shortfall
- Entropic risk measure
- Kullback–Leibler divergence
- Generalized relative entropy
References
- Ahmadi-Javid, Amir (2011). "An information-theoretic approach to constructing coherent risk measures". Proceedings of IEEE International Symposium on Information Theory, St. Petersburg, Russia, pp. 2125–2127. doi:10.1109/ISIT.2011.6033932.
- Ahmadi-Javid, Amir (2012). "Entropic value-at-risk: A new coherent risk measure". Journal of Optimization Theory and Applications 155 (3): 1105–1123. doi:10.1007/s10957-011-9968-2.
- Ahmadi-Javid, Amir (2012). "Addendum to: Entropic value-at-risk: A new coherent risk measure". Journal of Optimization Theory and Applications 155 (3): 1124–1128. doi:10.1007/s10957-012-0014-9.
- Breuer, Thomas; Csiszár, Imre (2013). "Measuring distribution model risk". arXiv:1301.4832v1.