I have been accepted into the Google SOC program. I will be working on session management for KDE. If those words don't mean anything to you, that's unfortunate, because there will now be technical content on my blog.
Interesting properties of the entropy function
Update 2017-07-03: Corrected the equation for the associative definition; thank you /u/Syrak.

This may not be the first time someone has recognized them, but I have recently discovered some interesting and useful properties of the entropy function, and I'd like to share them.

First, a definition: entropy — H(p1, p2, ..., pn) — is a function that quantifies the surprise in selecting an object from a set, where the probability of selecting each object is given by {p1, p2, ..., pn}. It has applications in communications, information theory, and other fields of mathematics.

Hb(p1, p2, ..., pn) = Σ(i=1..n) −pi logb(pi)

where b is normally 2, to express entropy in bits. Other definitions of H() use expected values and random variables.

As an analog to the definition above, I will also discuss the entropy of a set of frequencies, where pi = fi / Σ fi.

Entropy defined without bits: a definition that doesn't use bits is:

H(p1, p2, ..., pn) = Π(i.....
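The bit-based definition above is straightforward to compute directly. Here is a minimal sketch in Python; the function names `entropy` and `entropy_from_freqs` are my own, and the frequency version simply normalizes by the total before applying the same sum:

```python
import math

def entropy(probs, base=2):
    """Entropy H of a probability distribution {p1, ..., pn},
    in units of log base `base` (bits when base=2)."""
    # Terms with p == 0 contribute nothing (lim p->0 of -p log p is 0).
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def entropy_from_freqs(freqs, base=2):
    """Entropy of a set of frequencies: pi = fi / sum(f)."""
    total = sum(freqs)
    return entropy([f / total for f in freqs], base)

# A fair coin has exactly 1 bit of entropy.
print(entropy([0.5, 0.5]))            # 1.0
# Frequencies {1, 1, 2} give probabilities {0.25, 0.25, 0.5}.
print(entropy_from_freqs([1, 1, 2]))  # 1.5
```

The guard against zero probabilities matters in practice: by convention 0·log(0) is taken as 0, but `math.log(0)` would raise an error.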