For two probability density functions, this function computes the distance values of all distance measures available in getDistMethods() and returns them as a named vector of the corresponding distances. This vector is referred to as the distance diversity vector.
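
Conceptually, the distance diversity vector collects one value per method returned by getDistMethods(). The following sketch (assuming the philentropy package is attached) illustrates this idea by looping over the available methods with distance(); it is an illustration only, not the package's internal implementation, and the helper name dist_diversity_sketch is hypothetical.

library(philentropy)

# Hedged sketch: build a distance diversity vector by calling distance()
# once per method listed in getDistMethods().
dist_diversity_sketch <- function(x, p, test.na = FALSE, unit = "log2") {
  methods <- getDistMethods()
  vapply(methods, function(m) {
    if (m == "minkowski") {
      # only the Minkowski distance uses the power parameter p
      distance(x, method = m, p = p, test.na = test.na, unit = unit)
    } else {
      distance(x, method = m, test.na = test.na, unit = unit)
    }
  }, FUN.VALUE = numeric(1))
}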

dist.diversity(x, p, test.na = FALSE, unit = "log2")

Arguments

x

a numeric data.frame or matrix storing probability vectors, or a numeric data.frame or matrix storing counts (if est.prob is specified).

p

power of the Minkowski distance.

test.na

a logical value indicating whether or not input vectors should be tested for NA values. Computations are faster when test.na = FALSE.

unit

a character string specifying the logarithm unit that should be used to compute distances that depend on log computations; see the short illustration after the list below. Options are:

  • unit = "log"

  • unit = "log2"

  • unit = "log10"

Author

Hajk-Georg Drost

Examples

dist.diversity(rbind(1:10/sum(1:10), 20:29/sum(20:29)), p = 2, unit = "log2")
#> Metric: 'euclidean'; comparing: 2 vectors.
#> Metric: 'manhattan'; comparing: 2 vectors.
#> Metric: 'minkowski'; p = 2; comparing: 2 vectors.
#> Metric: 'chebyshev'; comparing: 2 vectors.
#> Metric: 'sorensen'; comparing: 2 vectors.
#> Metric: 'gower'; comparing: 2 vectors.
#> Metric: 'soergel'; comparing: 2 vectors.
#> Metric: 'kulczynski_d'; comparing: 2 vectors.
#> Metric: 'canberra'; comparing: 2 vectors.
#> Metric: 'lorentzian' using unit: 'log2'; comparing: 2 vectors.
#> Metric: 'intersection'; comparing: 2 vectors.
#> Metric: 'non-intersection'; comparing: 2 vectors.
#> Metric: 'wavehedges'; comparing: 2 vectors.
#> Metric: 'czekanowski'; comparing: 2 vectors.
#> Metric: 'motyka'; comparing: 2 vectors.
#> Metric: 'kulczynski_s'; comparing: 2 vectors.
#> Metric: 'tanimoto'; comparing: 2 vectors.
#> Metric: 'ruzicka'; comparing: 2 vectors.
#> Metric: 'inner_product'; comparing: 2 vectors.
#> Metric: 'harmonic_mean'; comparing: 2 vectors.
#> Metric: 'cosine'; comparing: 2 vectors.
#> Metric: 'hassebrook'; comparing: 2 vectors.
#> Metric: 'jaccard'; comparing: 2 vectors.
#> Metric: 'dice'; comparing: 2 vectors.
#> Metric: 'fidelity'; comparing: 2 vectors.
#> Metric: 'bhattacharyya' using unit: 'log2'; comparing: 2 vectors.
#> Metric: 'hellinger'; comparing: 2 vectors.
#> Metric: 'matusita'; comparing: 2 vectors.
#> Metric: 'squared_chord'; comparing: 2 vectors.
#> Metric: 'squared_euclidean'; comparing: 2 vectors.
#> Metric: 'pearson'; comparing: 2 vectors.
#> Metric: 'neyman'; comparing: 2 vectors.
#> Metric: 'squared_chi'; comparing: 2 vectors.
#> Metric: 'prob_symm'; comparing: 2 vectors.
#> Metric: 'divergence'; comparing: 2 vectors.
#> Metric: 'clark'; comparing: 2 vectors.
#> Metric: 'additive_symm'; comparing: 2 vectors.
#> Metric: 'kullback-leibler' using unit: 'log2'; comparing: 2 vectors.
#> Metric: 'jeffreys' using unit: 'log2'; comparing: 2 vectors.
#> Metric: 'k_divergence' using unit: 'log2'; comparing: 2 vectors.
#> Metric: 'topsoe' using unit: 'log2'; comparing: 2 vectors.
#> Metric: 'jensen-shannon' using unit: 'log2'; comparing: 2 vectors.
#> Metric: 'jensen_difference' using unit: 'log2'; comparing: 2 vectors.
#> Metric: 'taneja' using unit: 'log2'; comparing: 2 vectors.
#> Metric: 'kumar-johnson'; comparing: 2 vectors.
#> Metric: 'avg'; comparing: 2 vectors.
#>         euclidean         manhattan         minkowski         chebyshev
#>        0.12807130        0.35250464        0.12807130        0.06345083
#>          sorensen             gower           soergel      kulczynski_d
#>        0.17625232        0.03525046        0.29968454        0.42792793
#>          canberra        lorentzian      intersection  non-intersection
#>        2.09927095        0.49712136        0.82374768        0.17625232
#>        wavehedges       czekanowski            motyka      kulczynski_s
#>        3.16657887        0.17625232        0.58812616        2.33684211
#>          tanimoto           ruzicka     inner_product     harmonic_mean
#>        0.29968454        0.70031546        0.10612245        0.94948528
#>            cosine        hassebrook           jaccard              dice
#>        0.93427641        0.86613103        0.13386897        0.07173611
#>          fidelity     bhattacharyya         hellinger          matusita
#>        0.97312397        0.03930448        0.32787819        0.23184489
#>     squared_chord squared_euclidean           pearson            neyman
#>        0.05375205        0.01640226        0.16814418        0.36742465
#>       squared_chi         prob_symm        divergence             clark
#>        0.10102943        0.20205886        1.49843905        0.86557468
#>     additive_symm  kullback-leibler          jeffreys      k_divergence
#>        0.53556883        0.13926288        0.31761069        0.04216273
#>            topsoe    jensen-shannon jensen_difference            taneja
#>        0.07585498        0.03792749        0.03792749        0.04147518
#>     kumar-johnson               avg
#>        0.62779644        0.20797774