Update 2017-07-03: Corrected equation for associative definition, thank you /u/Syrak.

This may not be the first time someone has recognized this, but I recently discovered some interesting and useful properties of the entropy function and share them here.

First, a definition: entropy — H(p_1, p_2, ..., p_n) — is a function that quantifies the surprise in selecting an object from a set where the probability of selecting each object is given by {p_1, p_2, ..., p_n}. It has utility in communications, information theory, and other fields of math.

H_b(p_1, p_2, ..., p_n) = Σ_{i=1..n} -p_i log_b(p_i)

where b is normally 2, to express entropy in bits. Other definitions of H() use expected values and random variables.

As an analog to the definition above, I will also discuss the entropy of a set of frequencies, where p_i = f_i / Σ f_i.

Entropy defined without bits: a definition that doesn't use bits is:

H(p_1, p_2, ..., p_n) = Π(i.....
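The bit-based definition above is straightforward to sketch in code. The helper below (the name `entropy` is my own, not from the post) derives probabilities from raw frequencies as p_i = f_i / Σ f_i and sums -p_i log_b(p_i), skipping zero-probability terms since lim p→0 of p log p is 0:

```python
import math

def entropy(freqs, base=2):
    """Entropy of a set of frequencies.

    Probabilities are p_i = f_i / sum(f); base 2 gives the result in bits.
    Zero frequencies contribute nothing, since p*log(p) -> 0 as p -> 0.
    """
    total = sum(freqs)
    probs = (f / total for f in freqs)
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally surprising for two outcomes: 1 bit.
print(entropy([1, 1]))   # 1.0
# A biased 3:1 source is less surprising than a fair one.
print(entropy([3, 1]))   # ~0.811
```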
Comments
Since I did my undergrad at Beijing Language and Culture University (BLCU), which is exactly where the HSK test is held, I am quite familiar with the test.
I checked the website; there are three levels of HSK available: Basic, Junior & Middle, and Senior. When you say Beginner Level, I take it you mean the Basic level.
However, there are only two tests for the Basic level this year: one on April 18th, 2009, and the other on November 28th, 2009. I am not sure if you are gonna make it... As for the Junior & Middle level, the closest one is on October 18th, 2009.
Besides, I know a guy named 'Jeff Barnes'. He goes to Drexel too, and he spent 4~5 years studying Chinese at BLCU. I believe he was actually taking the HSK exams. You can find him on Facebook, or maybe you know him already. :-)
I will see what else I can find..
Feel free to let me know if there is anything I can help with.
Best,
Emmy
But if you wanna get one beforehand, you could go to Chinatown and walk into one of the bookstores to see if they have one (you may want to call first~).
Or maybe you can check with Triple A's local branch..
Emmy