This repository contains a small, simple, and efficient module implementing various Kullback-Leibler divergences for parametric 1D continuous or discrete distributions.
References:
- Garivier & Cappé & Kaufmann, 2012, for the pymaBandits project on which this implementation is based,
- Besson, 2018, SMPyBandits project for improvements on the initial implementation,
- Filippi & Cappé & Garivier, 2011,
- Garivier & Cappé, 2011,
- Kullback & Leibler, 1951, the first article introducing the so-called Kullback-Leibler divergence.
If the KullbackLeibler.jl file is accessible in your current folder or in Julia's LOAD_PATH:
julia> using KullbackLeibler
julia> KullbackLeibler.klBern(0.5, 0.5)
0.0
julia> KullbackLeibler.klBern(0.1, 0.9)
1.757779...
julia> KullbackLeibler.klBern(0.9, 0.1) # symmetric here only because 0.9 = 1 - 0.1; KL is not symmetric in general
1.757779...
julia> KullbackLeibler.klBern(0.4, 0.5)
0.020135...
julia> KullbackLeibler.klBern(0.01, 0.99)
4.503217...
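For reference, the Bernoulli KL divergence has the well-known closed form kl(p, q) = p log(p/q) + (1-p) log((1-p)/(1-q)); here is a quick plain-Julia check of the value above, independent of the package:
julia> p, q = 0.1, 0.9;
julia> p * log(p / q) + (1 - p) * log((1 - p) / (1 - q))  # should match klBern(0.1, 0.9)
1.757779...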
You can also use the generic KL function, which accepts distribution objects from the Distributions module.
julia> using Distributions
julia> Bern1 = Distributions.Bernoulli(0.33)
Distributions.Bernoulli{Float64}(p=0.33)
julia> Bern2 = Distributions.Bernoulli(0.42)
Distributions.Bernoulli{Float64}(p=0.42)
julia> ex_kl_1 = KL( Bern1, Bern2 ) # compute the KL divergence between the two Bernoulli distributions
0.017063...
julia> klBern(0.33, 0.42) # same!
0.017063...
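The other parametric families also admit closed-form divergences; as one standard example (written in plain Julia here, not as a call into the package), the KL between two Gaussians with means μ1, μ2 and a common variance σ² is (μ1 - μ2)² / (2σ²):
julia> μ1, μ2, σ2 = 0.0, 1.0, 0.5;   # illustrative means and shared variance
julia> (μ1 - μ2)^2 / (2 * σ2)        # closed-form KL(N(μ1, σ2) ‖ N(μ2, σ2))
1.0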
None of the functions are vectorized; each one expects a single scalar value for each argument. If you want vectorized functions, please ask and we will try to add them.
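In the meantime, Julia's dot (broadcast) syntax applies any scalar function elementwise, so the existing functions can already be used on arrays; a quick sketch:
julia> KullbackLeibler.klBern.([0.1, 0.4, 0.01], 0.5)  # the trailing dot broadcasts over the vector
3-element Array{Float64,1}:
 0.368064...
 0.020135...
 0.637145...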
See this file.
Easy! Download the KullbackLeibler.jl file and copy it to your working folder.
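For instance, once the file sits in the current working directory, a minimal way to load it directly (without touching LOAD_PATH) is:
julia> include("KullbackLeibler.jl");   # defines the KullbackLeibler module from the local file
julia> KullbackLeibler.klBern(0.1, 0.9)
1.757779...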
I will add this package to METADATA.jl
as soon as possible, and then it will be possible to install it using:
julia> Pkg.add("KullbackLeibler")
This module was initially a Python module; see here on GitHub.
Dependencies
- Distributions, v0.15+.
MIT Licensed (file LICENSE). © Lilian Besson, 2018.