Compute Shannon's Mutual Information based on the identity \(I(X,Y) = H(X) + H(Y) - H(X,Y)\), given a joint-probability vector \(P(X,Y)\) and the marginal probability vectors \(P(X)\) and \(P(Y)\).

MI(x, y, xy, unit = "log2")

Arguments

x

a numeric probability vector \(P(X)\).

y

a numeric probability vector \(P(Y)\).

xy

a numeric joint-probability vector \(P(X,Y)\).

unit

a character string specifying the logarithm unit used in the underlying entropy computations (default: "log2").

Value

Shannon's Mutual Information \(I(X,Y)\), measured in bits when unit = "log2" (the default).

Details

This function is useful for quickly computing Shannon's Mutual Information for any given joint-probability vector and the corresponding marginal probability vectors.
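Because the computation reduces to the identity \(I(X,Y) = H(X) + H(Y) - H(X,Y)\), it can be reproduced in a few lines of base R. In the sketch below, shannon_entropy() is a hypothetical helper (not part of this package), shown only to make the formula concrete:

# hypothetical helper (not exported by this package): Shannon entropy in bits
shannon_entropy <- function(p) -sum(p[p > 0] * log2(p[p > 0]))

x  <- c(0.5, 0.5)                     # P(X)
y  <- c(0.25, 0.75)                   # P(Y)
xy <- c(0.125, 0.125, 0.375, 0.375)   # P(X,Y) for independent X and Y

# I(X,Y) = H(X) + H(Y) - H(X,Y)
shannon_entropy(x) + shannon_entropy(y) - shannon_entropy(xy)
# 0 (up to floating-point error), since X and Y are independent here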

There are two ways the MI() function could have been implemented.

  • Version one: MI() takes the joint probability \(P(X,Y)\) as a 2D matrix and converts it internally to a joint-probability vector through indexing.

  • Version two: MI() takes the joint probability \(P(X,Y)\) directly as a probability vector, after the user has converted their 2D matrix to a joint-probability vector themselves.

We chose to implement version two to give users maximum flexibility in how they define their joint probabilities; the sketch below shows how to convert a 2D matrix into the vector form MI() expects.
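For example, a joint-probability matrix can be flattened into a joint-probability vector via as.vector(); the matrix below is made up purely for illustration:

joint_mat <- matrix(1:12, nrow = 3)
joint_mat <- joint_mat / sum(joint_mat)   # normalize to a valid joint distribution

xy <- as.vector(joint_mat)   # flatten (column-wise) to a joint-probability vector
x  <- rowSums(joint_mat)     # marginal P(X)
y  <- colSums(joint_mat)     # marginal P(Y)

MI(x = x, y = y, xy = xy)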

References

Shannon, Claude E. 1948. "A Mathematical Theory of Communication". Bell System Technical Journal 27 (3): 379-423.

See also

H, JE, CE

Author

Hajk-Georg Drost

Examples


# mutual information from the marginal vectors P(X), P(Y) and a joint-probability vector P(X,Y)
MI( x = 1:10/sum(1:10), y = 20:29/sum(20:29), xy = 1:10/sum(1:10) )
#> [1] 3.311973
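
Since MI() is based on the identity \(I(X,Y) = H(X) + H(Y) - H(X,Y)\), a joint distribution that factorizes into its marginals should yield a mutual information of (numerically) zero; outer() constructs such a joint distribution:

x <- 1:10/sum(1:10)
y <- 20:29/sum(20:29)
MI( x = x, y = y, xy = as.vector(outer(x, y)) )
# approximately 0, since the joint distribution factorizes into P(X) * P(Y)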