Inverted Dirichlet distribution

In statistics, the inverted Dirichlet distribution is a multivariate generalization of the beta prime distribution, and is related to the Dirichlet distribution. It was first described by Tiao and Guttman in 1965.[1]
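One standard construction makes these relationships explicit: if \left(Y_1,\ldots,Y_{k+1}\right) follows a Dirichlet distribution with parameters \nu_1,\ldots,\nu_{k+1}, then the ratios

X_i = \frac{Y_i}{Y_{k+1}},\qquad i=1,\ldots,k,

jointly follow an inverted Dirichlet distribution with the same parameters; for k=1 this reduces to the beta prime distribution.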

The distribution has a density function given by


p\left(x_1,\ldots, x_k\right) = \frac{\Gamma\left(\nu_1+\cdots+\nu_{k+1}\right)}{\prod_{j=1}^{k+1}\Gamma\left(\nu_j\right)}
x_1^{\nu_1-1}\cdots x_k^{\nu_k-1}\times\left(1+\sum_{i=1}^k x_i\right)^{-\sum_{j=1}^{k+1}\nu_j},\qquad x_i>0.
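As a brief illustration (not taken from the cited sources), the following Python sketch evaluates the log-density above and draws samples via the gamma-ratio construction X_i = G_i/G_{k+1}, where G_1, ..., G_{k+1} are independent Gamma(\nu_j, 1) variables; the function names are illustrative only.

# Minimal sketch: inverted Dirichlet log-density and a gamma-ratio sampler.
import numpy as np
from scipy.special import gammaln

def inverted_dirichlet_logpdf(x, nu):
    # x: k positive values (x_1, ..., x_k); nu: k+1 positive parameters.
    x = np.asarray(x, dtype=float)
    nu = np.asarray(nu, dtype=float)
    k = x.size
    log_norm = gammaln(nu.sum()) - gammaln(nu).sum()
    return (log_norm
            + np.sum((nu[:k] - 1.0) * np.log(x))
            - nu.sum() * np.log1p(x.sum()))

def inverted_dirichlet_sample(nu, size, seed=None):
    # X_i = G_i / G_{k+1} with independent G_j ~ Gamma(nu_j, 1).
    rng = np.random.default_rng(seed)
    g = rng.gamma(shape=nu, size=(size, len(nu)))
    return g[:, :-1] / g[:, -1:]

# Quick self-check: the Monte Carlo mean of x_1 should be close to
# nu_1 / (nu_{k+1} - 1), a special case of the mixed-moment formula below.
nu = np.array([2.0, 3.0, 5.0])
samples = inverted_dirichlet_sample(nu, size=200_000, seed=0)
print(samples[:, 0].mean(), nu[0] / (nu[-1] - 1.0))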

The distribution has applications in statistical regression and arises naturally when considering the multivariate Student distribution. It can be characterized[2] by its mixed moments:


E\left[\prod_{i=1}^kx_i^{q_i}\right] = \frac{\Gamma\left(\nu_{k+1}-\sum_{j=1}^k q_j\right)}{\Gamma\left(\nu_{k+1}\right)}\prod_{j=1}^k\frac{\Gamma\left(\nu_j+q_j\right)}{\Gamma\left(\nu_j\right)}

provided that q_j>-\nu_j for 1\leqslant j\leqslant k and \nu_{k+1}>q_1+\cdots+q_k.
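A quick Monte Carlo check of this moment formula (again an illustrative sketch, not from the cited sources), using the same gamma-ratio sampling idea:

# Numerical check of the mixed-moment formula for k = 2.
import numpy as np
from scipy.special import gammaln

nu = np.array([2.0, 3.0, 10.0])  # nu_1, nu_2, nu_{k+1}
q = np.array([1.0, 2.0])         # exponents; q_j > -nu_j and q_1 + q_2 < nu_{k+1}

rng = np.random.default_rng(1)
g = rng.gamma(shape=nu, size=(500_000, len(nu)))
x = g[:, :-1] / g[:, -1:]        # inverted Dirichlet samples

mc = np.mean(np.prod(x ** q, axis=1))   # Monte Carlo estimate of E[x_1^{q_1} x_2^{q_2}]
exact = np.exp(gammaln(nu[-1] - q.sum()) - gammaln(nu[-1])
               + np.sum(gammaln(nu[:-1] + q) - gammaln(nu[:-1])))
print(mc, exact)                 # both close to 1/21, about 0.0476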

The inverted Dirichlet distribution is conjugate to the negative multinomial distribution if a generalized form of odds ratio is used instead of the categories' probabilities.

T. Bdiri et al. have developed several models that use the inverted Dirichlet distribution to represent and model non-Gaussian data. They have introduced finite[3][4] and infinite[5] mixture models of inverted Dirichlet distributions, using the Newton–Raphson technique to estimate the parameters and the Dirichlet process to model infinite mixtures. They have also used the inverted Dirichlet distribution to propose an approach for generating Support Vector Machine kernels[6] based on Bayesian inference, and another approach for hierarchical clustering.[7][8]

References

  1. Tiao, George C.; Guttman, Irwin (1965). "The inverted Dirichlet distribution with applications". Journal of the American Statistical Association 60 (311): 793–805. doi:10.1080/01621459.1965.10480828.
  2. Ghorbel, M. (2010). "On the inverted Dirichlet distribution". Communications in Statistics – Theory and Methods 39: 21–37. doi:10.1080/03610920802627062.
  3. Bdiri, Taoufik; Bouguila, Nizar (2012). "Positive vectors clustering using inverted Dirichlet finite mixture models". Expert Systems with Applications 39: 1869–1882. doi:10.1016/j.eswa.2011.08.063.
  4. Bdiri, Taoufik; Bouguila, Nizar (2011). "Learning Inverted Dirichlet Mixtures for Positive Data clustering". Lecture Notes in Computer Science: 265–272. doi:10.1007/978-3-642-21881-1_42.
  5. Bdiri, Taoufik; Bouguila, Nizar (2011). "An Infinite Mixture of Inverted Dirichlet Distributions". Neural Information Processing 7063: 71–78. doi:10.1007/978-3-642-24958-7_9.
  6. Bdiri, Taoufik; Bouguila, Nizar (2013). "Bayesian learning of inverted Dirichlet mixtures for SVM kernels generation". Neural Computing and Applications 23: 1443–1458. doi:10.1007/s00521-012-1094-z.
  7. Bdiri, Taoufik; Bouguila, Nizar; Ziou, Djemel (2014). "Object clustering and recognition using multi-finite mixtures for semantic classes and hierarchy modeling". Expert Systems with Applications 41: 1218–1235. doi:10.1016/j.eswa.2013.08.005.
  8. Bdiri, Taoufik; Bouguila, Nizar; Ziou, Djemel (2013). "Visual Scenes Categorization Using a Flexible Hierarchical Mixture Model Supporting Users Ontology". IEEE 25th International Conference on Tools with Artificial Intelligence (ICTAI): 262–267. doi:10.1109/ICTAI.2013.48.

