Compute Shannon's Conditional-Entropy via the chain rule $$H(X | Y) = H(X,Y) - H(Y)$$ from a given joint-probability vector $$P(X,Y)$$ and probability vector $$P(Y)$$.

CE(xy, y, unit = "log2")

## Arguments

xy a numeric joint-probability vector $$P(X,Y)$$ from which Shannon's Joint-Entropy $$H(X,Y)$$ shall be computed.

y a numeric probability vector $$P(Y)$$ from which Shannon's Entropy $$H(Y)$$ (as part of the chain rule) shall be computed. Note that this must be the probability distribution of the random variable Y, i.e. the P(Y) for which H(Y) is computed.

unit a character string specifying the logarithm unit that shall be used in the entropy computations.

## Value

Shannon's Conditional-Entropy $$H(X | Y)$$, expressed in the unit specified by the unit argument (bits for the default "log2").

## Details

This function might be useful to quickly compute Shannon's Conditional-Entropy for any given joint-probability vector and probability vector.

## Note

Note that the probability vector P(Y) must be the probability distribution of the random variable Y (the P(Y) from which H(Y) is computed), since it enters the chain-rule computation $$H(X | Y) = H(X,Y) - H(Y)$$.

## References

Shannon, Claude E. 1948. "A Mathematical Theory of Communication". Bell System Technical Journal 27 (3): 379-423.

## See Also

H, JE