This function computes a distance value or distance matrix based on the Jensen-Shannon Divergence with equal weights.
JSD(x, test.na = TRUE, unit = "log2", est.prob = NULL)
x  a numeric matrix storing probability vectors in its rows, or a numeric matrix storing count vectors (if est.prob is specified). 

test.na  a logical value indicating whether input vectors shall be tested for NA values. 
unit  a character string specifying the logarithm unit that shall be used to compute distances that depend on log computations. Default: unit = "log2". 
est.prob  method to estimate probabilities from input count vectors such as non-probability vectors. Default: est.prob = NULL.
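With est.prob = "empirical", counts are turned into probabilities by dividing each entry by the vector sum, as the count-vector example below illustrates. A minimal sketch of that normalization, written in Python so it is independent of the documented R function (empirical_prob is a hypothetical helper name):

```python
def empirical_prob(counts):
    """Empirical probability estimate: p_i = c_i / sum(c)."""
    total = sum(counts)
    return [c / total for c in counts]

# counts 1..10 become a probability vector summing to ~1
P = empirical_prob(range(1, 11))
print(sum(P))  # approximately 1 (up to floating-point rounding)
```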

a distance value or matrix based on JSD computations.
Function to compute the Jensen-Shannon Divergence JSD(P || Q) between two probability distributions P and Q with equal weights \(\pi_1 = \pi_2 = 1/2\).
The Jensen-Shannon Divergence JSD(P || Q) between two probability distributions P and Q is defined as:
$$JSD(P || Q) = 0.5 * (KL(P || R) + KL(Q || R))$$
where \(R = 0.5 * (P + Q)\) denotes the mid-point of the probability vectors P and Q, and KL(P || R), KL(Q || R) denote the Kullback-Leibler Divergence of P and R, as well as of Q and R.
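The definition above can be checked with a short stand-alone sketch, written in Python so it does not depend on the documented R function (kl and jsd are hypothetical helper names; the probability vectors mirror the examples below):

```python
import math

def kl(p, q, base=2.0):
    """Kullback-Leibler divergence KL(p || q) in the given log base."""
    return sum(pi * math.log(pi / qi, base) for pi, qi in zip(p, q) if pi > 0)

def jsd(p, q, base=2.0):
    """Equal-weight Jensen-Shannon divergence: 0.5 * (KL(p||r) + KL(q||r)),
    where r = 0.5 * (p + q) is the mid-point distribution."""
    r = [0.5 * (pi + qi) for pi, qi in zip(p, q)]
    return 0.5 * kl(p, r, base) + 0.5 * kl(q, r, base)

P = [i / 55 for i in range(1, 11)]    # 1:10 / sum(1:10)
Q = [i / 245 for i in range(20, 30)]  # 20:29 / sum(20:29)
print(round(jsd(P, Q), 8))  # reproduces the base-2 value 0.03792749 from the examples
```

The same function with base = math.e or base = 10 reproduces the "log" and "log10" values shown in the examples.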
General properties of the Jensen-Shannon Divergence:
1) JSD is non-negative.
2) JSD is a symmetric measure: JSD(P || Q) = JSD(Q || P).
3) JSD = 0, if and only if P = Q.
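These three properties can be demonstrated numerically. A small Python sketch, independent of the documented R function (jsd is a hypothetical helper implementing the mid-point formula above):

```python
import math

def jsd(p, q, base=2.0):
    # equal-weight Jensen-Shannon divergence via the mid-point r = (p + q) / 2
    r = [0.5 * (pi + qi) for pi, qi in zip(p, q)]
    kl = lambda a, b: sum(ai * math.log(ai / bi, base)
                          for ai, bi in zip(a, b) if ai > 0)
    return 0.5 * kl(p, r) + 0.5 * kl(q, r)

P = [i / 55 for i in range(1, 11)]
Q = [i / 245 for i in range(20, 30)]

assert jsd(P, Q) >= 0                      # 1) non-negativity
assert math.isclose(jsd(P, Q), jsd(Q, P))  # 2) symmetry
assert jsd(P, P) == 0                      # 3) identical inputs give JSD = 0
```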
Lin J. 1991. "Divergence Measures Based on the Shannon Entropy". IEEE Transactions on Information Theory. (37) 1: 145-151.
Endres M. and Schindelin J. E. 2003. "A new metric for probability distributions". IEEE Transactions on Information Theory. (49) 7: 1858-1860.
Hajk-Georg Drost
# Jensen-Shannon Divergence between P and Q
P <- 1:10/sum(1:10)
Q <- 20:29/sum(20:29)
x <- rbind(P, Q)
JSD(x)
#> jensen-shannon 
#>     0.03792749 

# Jensen-Shannon Divergence between P and Q using different log bases
JSD(x, unit = "log2") # Default
#> jensen-shannon 
#>     0.03792749 
JSD(x, unit = "log")
#> jensen-shannon 
#>     0.02628933 
JSD(x, unit = "log10")
#> jensen-shannon 
#>     0.01141731 

# Jensen-Shannon Divergence between count vectors P.count and Q.count
P.count <- 1:10
Q.count <- 20:29
x.count <- rbind(P.count, Q.count)
JSD(x.count, est.prob = "empirical")
#> jensen-shannon 
#>     0.03792749 

# Example: Distance Matrix using JSD-Distance
Prob <- rbind(1:10/sum(1:10), 20:29/sum(20:29), 30:39/sum(30:39))
# compute the JSD matrix of a given probability matrix
JSDMatrix <- JSD(Prob)