This function computes Shannon's Joint-Entropy $$H(X,Y) = - \sum \sum P(X,Y) \cdot log2(P(X,Y))$$ based on a given joint-probability vector $$P(X,Y)$$.

JE(x, unit = "log2")

## Arguments

x a numeric joint-probability vector $$P(X,Y)$$ for which Shannon's Joint-Entropy $$H(X,Y)$$ shall be computed.

unit a character string specifying the logarithm unit that shall be used to compute distances that depend on log computations.

## Value

a numeric value representing Shannon's Joint-Entropy $$H(X,Y)$$ (in bit when unit = "log2").

## References

Shannon, Claude E. 1948. "A Mathematical Theory of Communication". Bell System Technical Journal 27 (3): 379-423.

## See Also

H, CE, KL, JSD, gJSD, distance
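To make the formula concrete, the computation can be sketched as follows. This is an illustrative Python sketch of the joint-entropy definition above, not the package's R implementation; the function name `joint_entropy` and the set of unit strings handled here are assumptions for illustration.

```python
import math

def joint_entropy(p, unit="log2"):
    """Shannon's joint entropy H(X,Y) from a joint-probability vector P(X,Y).

    Illustrative sketch: unit selects the logarithm base; the unit names
    here are assumptions, not the package's documented options.
    """
    base = {"log2": 2, "log": math.e, "log10": 10}[unit]
    # By convention, terms with P(x, y) = 0 contribute 0 (lim p*log(p) = 0).
    return -sum(pxy * math.log(pxy, base) for pxy in p if pxy > 0)

# A uniform joint distribution over four (x, y) pairs has entropy 2 bit.
print(joint_entropy([0.25, 0.25, 0.25, 0.25]))
```

With the uniform vector above the result is 2 bit, since each of the four equally likely pairs contributes $$0.25 \cdot 2$$.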