Update 2017-07-03: Corrected equation for associative definition, thank you /u/Syrak.

This may not be the first time someone has recognized this, but I have recently discovered some interesting and useful properties of the entropy function and now share them.

First, a definition: Entropy, H(p_1, p_2, ..., p_n), is a function that quantifies surprise in selecting an object from a set where the probability of selecting each object is given: {p_1, p_2, ..., p_n}. This has utility in communications, information theory, and other fields of math.

H_b(p_1, p_2, ..., p_n) = -Σ_{i=1}^{n} p_i log_b(p_i)

where b is normally 2, to express entropy in bits. Other definitions of H() use expected values and random variables. As an analog to the definition above, I will discuss the entropy of a set of frequencies, where p_i = f_i / Σ_i f_i.

Entropy defined without bits: A definition that doesn't use bits is: H(p_1, p_2, ..., p_n) = Π(i.....
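To make the definition concrete, here is a minimal Python sketch of the sum-based formula above, for both probabilities and raw frequencies. The function names entropy and entropy_from_frequencies are mine for illustration, not from the post.

```python
import math

def entropy(probs, base=2):
    """Shannon entropy: H_b(p_1, ..., p_n) = -sum(p_i * log_b(p_i))."""
    # Terms with p_i = 0 contribute nothing, so they are skipped.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def entropy_from_frequencies(freqs, base=2):
    """Convert frequencies to probabilities p_i = f_i / sum(f) and reuse entropy()."""
    total = sum(freqs)
    return entropy([f / total for f in freqs], base)

print(entropy([0.5, 0.5]))                      # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))                      # biased coin: ~0.469 bits
print(entropy_from_frequencies([3, 1, 1, 1]))   # counts 3:1:1:1
```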
Comments
Choose 1 switch. Flick it on and off continuously for a good, long while (say, a few days), or alternatively leave it on for a very long time (like, a year or two). The point is that the problem specifies no time limit.
Once you've done that, then with the remaining four switches do the same thing as with the four-switch problem. Turn two switches on for a few minutes, then turn one of them off and turn another on. Quickly enter the room and feel the bulbs.
You should be able to differentiate the last four bulbs between on/hot, on/cold, off/hot, off/cold as with the four-switch problem. The fifth bulb, the one with the broken filament, is the first one you chose.
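Purely as an illustration (not part of the original solution), a short Python sketch enumerating what each bulb looks like on entering the room shows that all five are distinguishable in a single trip:

```python
# Switch 1 was cycled or left on long enough to burn out its bulb; switches 2-5
# follow the four-switch scheme described above.
observations = {
    "switch 1": "burned out (broken filament)",
    "switch 2": "on and hot",    # on for a few minutes, left on
    "switch 3": "on and cold",   # switched on just before entering
    "switch 4": "off and hot",   # on for a few minutes, then switched off
    "switch 5": "off and cold",  # never switched on
}

# Every switch maps to a distinct observation, so one entry into the room suffices.
assert len(set(observations.values())) == len(observations)
for switch, state in observations.items():
    print(f"{switch}: {state}")
```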
Previously I thought this assumption failed, and I even tested it. However, now I see that for lower-wattage bulbs it is true! Naturally, valid assumptions are given to contestants, and Duncan retroactively gets his point.
You, sir, are however WAY past the deadline.
FYI, the cop-out answer was: the switches are labeled.