Adjusted mutual information

In probability theory and information theory, adjusted mutual information (AMI) is a variation of mutual information that may be used for comparing clusterings.[1] It corrects for the agreement that arises solely due to chance between clusterings, in much the same way that the adjusted Rand index corrects the Rand index. It is closely related to the variation of information:[2] when a similar adjustment is made to the VI index, it becomes equivalent to the AMI.[1] The adjusted measure, however, is no longer a metric.[3]

Mutual information of two partitions

Given a set S of N elements S=\{s_1, s_2,\ldots s_N\}, consider two partitions of S, namely U=\{U_1, U_2,\ldots, U_R\} with R clusters, and V=\{V_1, V_2,\ldots, V_C\} with C clusters. It is presumed here that the partitions are so-called hard clusterings; the clusters are pairwise disjoint:

U_i\cap U_j = V_i\cap V_j = \varnothing

for all i\ne j, and complete:

\cup_{i=1}^RU_i=\cup_{j=1}^C V_j=S
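As a concrete illustration of these two conditions, the short Python sketch below checks disjointness and completeness for a small example; the set S and the partitions U and V are made up for illustration and do not come from the article.

from itertools import combinations

S = set(range(10))                          # N = 10 elements labelled 0..9
U = [{0, 1, 2, 3}, {4, 5, 6}, {7, 8, 9}]    # R = 3 clusters
V = [{0, 1, 2, 3, 4}, {5, 6, 7, 8, 9}]      # C = 2 clusters

def is_hard_partition(clusters, universe):
    # pairwise disjoint and jointly exhaustive
    disjoint = all(a.isdisjoint(b) for a, b in combinations(clusters, 2))
    complete = set().union(*clusters) == universe
    return disjoint and complete

assert is_hard_partition(U, S) and is_hard_partition(V, S)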

The cluster overlap between U and V can be summarized in the form of an R×C contingency table M=[n_{ij}]^{i=1 \ldots R}_{j=1 \ldots C}, where n_{ij} denotes the number of objects common to clusters U_i and V_j. That is,

n_{ij}=\left|U_i\cap V_j\right|
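In practice, clusterings are often stored as label vectors rather than as families of sets. The following minimal sketch builds the contingency table M from two such vectors; the label vectors u and v are illustrative and encode the same example partitions as above.

import numpy as np

u = np.array([0, 0, 0, 0, 1, 1, 1, 2, 2, 2])   # element k is in cluster u[k] of U
v = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])   # element k is in cluster v[k] of V

R, C = u.max() + 1, v.max() + 1
M = np.zeros((R, C), dtype=int)
for i, j in zip(u, v):
    M[i, j] += 1                               # n_ij = |U_i ∩ V_j|
# M == [[4, 0],
#       [1, 2],
#       [0, 3]]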

Suppose an object is picked at random from S; the probability that the object falls into cluster U_i is:

P(i)=\frac{|U_i|}{N}

The entropy associated with the partitioning U is:

H(U)=-\sum_{i=1}^R P(i)\log P(i)

H(U) is non-negative and takes the value 0 only when there is no uncertainty in determining an object's cluster membership, i.e., when there is only one cluster. Similarly, the entropy of the clustering V can be calculated as:

H(V)=-\sum_{j=1}^C P'(j)\log P'(j)

where P'(j)={|V_j|}/{N}. The mutual information (MI) between the two partitions is:

MI(U,V)=\sum_{i=1}^R \sum_{j=1}^C P(i,j)\log \frac{P(i,j)}{P(i)P'(j)}

where P(i,j) denotes the probability that a point belongs to both the cluster U_i in U and cluster V_j in V:

P(i,j)=\frac{|U_i \cap V_j|}{N}

MI is a non-negative quantity upper bounded by the entropies H(U) and H(V). It quantifies the information shared by the two clusterings and thus can be employed as a clustering similarity measure.
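The entropies and the mutual information can be computed directly from the contingency table. The following sketch continues the example table from above and uses natural logarithms, consistent with the formulas in this section.

import numpy as np

M = np.array([[4, 0], [1, 2], [0, 3]], dtype=float)
N = M.sum()

P_i = M.sum(axis=1) / N      # P(i)  = |U_i| / N
P_j = M.sum(axis=0) / N      # P'(j) = |V_j| / N
P_ij = M / N                 # P(i, j) = |U_i ∩ V_j| / N

H_U = -np.sum(P_i * np.log(P_i))
H_V = -np.sum(P_j * np.log(P_j))

nz = P_ij > 0                # empty cells contribute 0 (0 log 0 := 0)
MI = np.sum(P_ij[nz] * np.log(P_ij[nz] / np.outer(P_i, P_j)[nz]))
# 0 <= MI <= min(H_U, H_V)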

Adjustment for chance

Like the Rand index, the mutual information between two random clusterings does not take on a constant baseline value; it tends to be larger when the two partitions have more clusters (for a fixed number of set elements N). By adopting a hypergeometric model of randomness, it can be shown that the expected mutual information between two random clusterings is:

\begin{align} E\{MI(U,V)\} = &
\sum_{i=1}^R \sum_{j=1}^C 
\sum_{n_{ij}=(a_i+b_j-N)^+}^{\min(a_i, b_j)} 
\frac{n_{ij}}{N} 
\log \left( \frac{ N\cdot n_{ij}}{a_i b_j}\right) \times \\
& \frac{a_i!b_j!(N-a_i)!(N-b_j)!}
{N!n_{ij}!(a_i-n_{ij})!(b_j-n_{ij})!(N-a_i-b_j+n_{ij})!} \\
\end{align}

where (a_i+b_j-N)^+ denotes \max(1,a_i+b_j-N). The variables a_i and b_j are partial sums of the contingency table; that is,

a_i=\sum_{j=1}^Cn_{ij}

and

b_j=\sum_{i=1}^Rn_{ij}
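The expected mutual information can be evaluated by summing the formula above term by term. The sketch below does this for the example contingency table, computing the hypergeometric term in log-space to avoid overflow of the factorials; it is a direct transcription of the formula, not an optimized implementation.

import numpy as np
from math import lgamma, log, exp

def log_fact(n):
    return lgamma(n + 1)     # log(n!) via the log-gamma function

M = np.array([[4, 0], [1, 2], [0, 3]])
N = int(M.sum())
a = M.sum(axis=1)            # a_i: row sums of the contingency table
b = M.sum(axis=0)            # b_j: column sums of the contingency table

EMI = 0.0
for ai in a:
    for bj in b:
        lo = max(1, ai + bj - N)              # (a_i + b_j - N)^+
        hi = min(ai, bj)
        for nij in range(lo, hi + 1):
            log_term = (log_fact(ai) + log_fact(bj)
                        + log_fact(N - ai) + log_fact(N - bj)
                        - log_fact(N) - log_fact(nij)
                        - log_fact(ai - nij) - log_fact(bj - nij)
                        - log_fact(N - ai - bj + nij))
            EMI += (nij / N) * log(N * nij / (ai * bj)) * exp(log_term)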

The adjusted measure[1] for the mutual information may then be defined to be:

AMI(U,V)= \frac{MI(U,V)-E\{MI(U,V)\}} {\max{\{H(U),H(V)\}}-E\{MI(U,V)\}}

The AMI takes a value of 1 when the two partitions are identical, and a value of 0 when the MI between the two partitions equals the value expected due to chance alone.
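Putting the pieces together, the AMI for the running example can be obtained from the quantities MI, EMI, H_U and H_V computed in the sketches above. For comparison, scikit-learn exposes this measure as adjusted_mutual_info_score; note that, depending on the library version, its default normalization may use the arithmetic mean of the entropies rather than the maximum, so its value can differ slightly from the formula above.

AMI = (MI - EMI) / (max(H_U, H_V) - EMI)

from sklearn.metrics import adjusted_mutual_info_score
u = [0, 0, 0, 0, 1, 1, 1, 2, 2, 2]
v = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
print(adjusted_mutual_info_score(u, v))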

References

  1. Vinh, N. X.; Epps, J.; Bailey, J. (2009). "Information theoretic measures for clusterings comparison". Proceedings of the 26th Annual International Conference on Machine Learning - ICML '09. p. 1. doi:10.1145/1553374.1553511. ISBN 9781605585161.
  2. Meila, M. (2007). "Comparing clusterings—an information based distance". Journal of Multivariate Analysis 98 (5): 873–895. doi:10.1016/j.jmva.2006.11.013.
  3. Vinh, N. X.; Epps, J.; Bailey, J. (2010). "Information Theoretic Measures for Clusterings Comparison: Variants, Properties, Normalization and Correction for Chance" (PDF). The Journal of Machine Learning Research 11: 2837–2854.
