### Interesting properties of the entropy function

Update 2017-07-03: Corrected equation for associative definition, thank you /u/Syrak.


This may not be the first time someone recognized this, but I have recently discovered some interesting and useful properties of the entropy function and now share them.

First a definition:

Entropy, H(p_1, p_2, ..., p_n), is a function that quantifies the surprise in selecting an object from a set where the probability of selecting each object is given: {p_1, p_2, ..., p_n}. This has utility in communications, information theory, and other fields of math.

$$H_b(p_1, p_2, \ldots, p_n) = \sum_{i=1}^{n} -p_i \log_b(p_i)$$

where b is normally 2, to express entropy in bits. Other definitions of H() use expected values and random variables. As an analog to the definition above, I will discuss the entropy of a set of frequencies, with p_i = f_i / Σ f_i.
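For concreteness, here is a minimal Python sketch of this definition (the function name `entropy` and the example values are mine, purely illustrative):

```python
import math

def entropy(probs, b=2):
    """Shannon entropy H_b of a probability distribution.

    Follows the usual convention 0 * log(0) = 0, so
    zero-probability symbols contribute nothing.
    """
    return sum(-p * math.log(p, b) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
print(entropy([0.9, 0.1]))   # ~0.469 bits: a biased coin surprises less
```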

Entropy defined without bits:

A definition that doesn't use bits is:

$$H(p_1, p_2, \ldots, p_n) = \prod_{i=1}^{n} p_i^{-p_i}$$

And you can convert between the definitions with H_b = log_b(H). H() is also called *perplexity*.
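A quick numeric check of the conversion (a sketch; `perplexity` is my name for the bit-free H()):

```python
import math

def perplexity(probs):
    """Bit-free entropy: H = prod(p_i ** -p_i), a.k.a. perplexity."""
    result = 1.0
    for p in probs:
        if p > 0:              # a zero-probability symbol contributes a factor of 1
            result *= p ** -p
    return result

probs = [0.5, 0.25, 0.25]
H = perplexity(probs)          # ~2.83 "effective choices"
print(math.log(H, 2))          # 1.5, matching the bits-based definition above
```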

A recursive (associative) definition of entropy:

The entropy function is associative. Or maybe it should be said that the entropy function can be applied recursively. Consider the function as applying to frequencies (rather than probabilities).

If we have the frequencies f = {17, 24, 16, 0, 9}, we can find the entropy H(f) if we have a bifurcation of the set, f_a + f_b = f, the entropy of each part, H(f_a) and H(f_b), and the total of each part's frequencies, Σ f_a and Σ f_b:

$$H_b(f_a + f_b) = H_b(f_a) \frac{\sum f_a}{\sum f_a + \sum f_b} + H_b(f_b) \frac{\sum f_b}{\sum f_a + \sum f_b} + H_b(\{\sum f_a, \sum f_b\})$$
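A small check of this identity on the example frequencies (a minimal sketch; the helper `H`, which normalizes frequencies to probabilities, is my own):

```python
import math

def H(freqs, b=2):
    """Entropy of a set of frequencies: normalize to probabilities, then H_b."""
    t = sum(freqs)
    return sum(-f / t * math.log(f / t, b) for f in freqs if f > 0)

f_a, f_b = [17, 24], [16, 0, 9]   # one bifurcation of f = {17, 24, 16, 0, 9}
ta, tb = sum(f_a), sum(f_b)

lhs = H(f_a + f_b)                # list concatenation reassembles f
rhs = H(f_a) * ta / (ta + tb) + H(f_b) * tb / (ta + tb) + H([ta, tb])
print(abs(lhs - rhs) < 1e-12)     # True: both sides agree
```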

One case of the above definition is when f_b contains a single element, i. (We use t = Σ f_a hereafter.)

$$H_b(f_a + i) = H_b(f_a) \frac{t}{t+i} + \frac{t}{t+i} \log_b\left(\frac{t+i}{t}\right) + \frac{i}{t+i} \log_b\left(\frac{t+i}{i}\right)$$

And we can use this without bits as well:

$$H(f_a + i) = H(f_a)^{t/(t+i)} \left(\frac{t+i}{i}\right)^{i/(t+i)} \left(\frac{t+i}{t}\right)^{t/(t+i)}$$
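As a sketch of how the single-element case enables incremental computation (the name `add_symbol` is mine; frequencies are folded in one at a time):

```python
import math

def add_symbol(H_a, t, i, b=2):
    """Fold a new symbol with frequency i into a running entropy H_a
    over frequencies totaling t; returns the enlarged set's entropy."""
    n = t + i
    return (H_a * t / n
            + t / n * math.log(n / t, b)
            + i / n * math.log(n / i, b))

# Build H({17, 24, 16, 9}) one frequency at a time; a zero frequency
# contributes nothing, so the 0 from the example set is simply skipped.
H, t = 0.0, 17                    # a single symbol has zero entropy
for i in (24, 16, 9):
    H = add_symbol(H, t, i)
    t += i
print(H)                          # equals the entropy of {17, 24, 16, 0, 9}
```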

An iterative definition of entropy:

Entropy can be defined directly from the symbol frequencies f_i, and not the probabilities p_i. This is useful in computations where the entire symbol set is not known in advance and you want to avoid two passes (one to find the total, one to calculate), especially in parallel processing, or where arithmetic is expensive and precalculated values will be used. With t = Σ f_i:

$$H_b(f_1, f_2, \ldots, f_n) = \log_b(t) - \frac{1}{t} \sum_{i=1}^{n} f_i \log_b(f_i)$$

And, for completeness, the iterative definition of entropy without bits:

$$H(f_1, f_2, \ldots, f_n) = t \left(\prod_{i=1}^{n} f_i^{f_i}\right)^{-1/t}$$
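Finally, a one-pass sketch of the frequency form (names are mine), accumulating t and Σ f_i log_b(f_i) together so the symbol stream only needs to be traversed once:

```python
import math

def entropy_one_pass(freqs, b=2):
    """H_b directly from frequencies: log_b(t) - (1/t) * sum(f_i * log_b(f_i)).

    One loop accumulates both the total t and the weighted log sum,
    so the frequencies can arrive as a stream of unknown length.
    """
    t, weighted = 0, 0.0
    for f in freqs:
        if f > 0:                 # zero frequencies contribute nothing
            t += f
            weighted += f * math.log(f, b)
    return math.log(t, b) - weighted / t

print(entropy_one_pass([17, 24, 16, 0, 9]))   # same result as normalizing first
```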
