Compute Shannon's Entropy \(H(X) = - \sum_{x} P(x) \cdot \log_2(P(x))\) for a given probability vector \(P(X)\).
H(x, unit = "log2")
A numeric value representing Shannon's Entropy in bits (the default, unit = "log2").
This function is useful for quickly computing Shannon's Entropy for any given probability vector.
Shannon, Claude E. 1948. "A Mathematical Theory of Communication". Bell System Technical Journal 27 (3): 379-423.
H(1:10/sum(1:10))
#> [1] 3.103643
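The formula above can be reproduced outside of R as well. Below is a minimal Python sketch of the same computation; the function name `shannon_entropy` is an illustrative choice, not part of this package.

```python
import math

def shannon_entropy(p, base=2):
    # H(X) = -sum P(x) * log_base(P(x)); by convention, zero
    # probabilities contribute nothing (0 * log 0 = 0).
    return -sum(x * math.log(x, base) for x in p if x > 0)

# Same probability vector as the R example 1:10/sum(1:10)
probs = [i / 55 for i in range(1, 11)]
print(round(shannon_entropy(probs), 6))  # 3.103643, matching the R output
```

Passing a different `base` (e.g. `math.e`) corresponds to choosing a different `unit` in the R function, yielding entropy in nats instead of bits.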