From the greatest fountain of human knowledge ever constructed: Wikipedia.

(All of science is a very close second.)

In mathematical statistics, the **Kullback–Leibler divergence** (also called **relative entropy**) is a measure of how one probability distribution is different from a second, reference probability distribution. Applications include characterizing the **relative (Shannon) entropy** in information systems, **randomness** in continuous time-series, and **information gain** when comparing statistical models of inference.
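For discrete distributions P and Q, the divergence is defined as D_KL(P‖Q) = Σ p(x) log(p(x)/q(x)). A minimal Python sketch (names and the coin-flip example are my own, not from the quoted article):

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) for discrete distributions given as equal-length
    lists of probabilities. Uses the natural log, so the result is in nats.
    Terms with p(x) = 0 contribute nothing by the convention 0 * log 0 = 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example: divergence of a biased coin from a fair reference coin
p = [0.9, 0.1]   # biased coin (P)
q = [0.5, 0.5]   # fair coin (Q, the reference)
print(kl_divergence(p, q))
```

Note that the divergence is asymmetric: D_KL(P‖Q) generally differs from D_KL(Q‖P), which is why the second distribution is singled out as the reference.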

