Compute Shannon's mutual information based on the identity \(I(X,Y) = H(X) + H(Y) - H(X,Y)\), given a joint-probability vector \(P(X,Y)\) and the marginal probability vectors \(P(X)\) and \(P(Y)\).

MI(x, y, xy, unit = "log2")

Arguments

x

a numeric probability vector \(P(X)\).

y

a numeric probability vector \(P(Y)\).

xy

a numeric joint-probability vector \(P(X,Y)\).

unit

a character string specifying the logarithm unit used in the underlying entropy computations (default: "log2").

Value

Shannon's mutual information, in bit for the default unit = "log2".

Details

This function is useful for quickly computing Shannon's mutual information from any given joint-probability vector and the corresponding marginal probability vectors.
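
The identity above can be checked by hand. As a minimal sketch, assuming that H() returns Shannon's entropy of a probability vector and JE() returns the joint entropy of a joint-probability vector (both listed under "See also"), the following two calls should agree:

# sketch: verify I(X,Y) = H(X) + H(Y) - H(X,Y)
# (assumes H() and JE() from this package; see "See also")
x  <- 1:10 / sum(1:10)
y  <- 20:29 / sum(20:29)
xy <- 1:10 / sum(1:10)
H(x, unit = "log2") + H(y, unit = "log2") - JE(xy, unit = "log2")
MI(x = x, y = y, xy = xy, unit = "log2")  # should match the value above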

References

Shannon, Claude E. 1948. "A Mathematical Theory of Communication". Bell System Technical Journal 27 (3): 379-423.

See also

H, JE, CE

Author

Hajk-Georg Drost

Examples

MI( x = 1:10/sum(1:10), y = 20:29/sum(20:29), xy = 1:10/sum(1:10) )
#> [1] 3.311973
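
# A hedged additional example: assuming unit = "log" selects the natural
# logarithm, the result is simply rescaled from bits to nats,
# i.e. about 3.311973 * log(2) ~ 2.2957 nats.
MI( x = 1:10/sum(1:10), y = 20:29/sum(20:29), xy = 1:10/sum(1:10), unit = "log" )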