Today I Learned:
1) ...a solid theoretical reason that information (entropy) is measured as a logarithm. It follows from four axiomatic assumptions about information, chosen so the measure behaves the way an intuitive notion of information should.
a) Information should be a number between 0 and infinity.
b) If an event has probability 1, there is no information associated with learning that it happens, so the entropy of an event with probability 1 is 0.
c) The entropy of two independent events jointly occurring is the sum of their entropies (that is, the information you learn from two independent events is the sum of their information).
d) The entropy function should be continuous.
From these axioms, the only possible functions are logarithms (specifically, the negative logarithm of the probability); the choice of base just sets the units, e.g. bits for base 2.
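The axioms above can be checked numerically. This is just an illustrative sketch: `info` below is the standard self-information, -log(p), and the assertions verify axioms (b) and (c) for a couple of example probabilities.

```python
import math

def info(p, base=2):
    """Self-information of an event with probability p (bits for base 2)."""
    return -math.log(p, base)

# Axiom (b): an event with probability 1 carries no information.
assert info(1.0) == 0.0

# Axiom (c): for independent events, P(A and B) = P(A) * P(B),
# and the logarithm turns that product into a sum of informations.
pa, pb = 0.5, 0.125
assert math.isclose(info(pa * pb), info(pa) + info(pb))

print(info(pa), info(pb), info(pa * pb))  # approximately 1.0, 3.0, 4.0
```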
2) The Soviet Union in World War II may have had the most gender-equal army of all time. Hundreds of thousands of women served in the USSR armed forces as soldiers, elite snipers, pilots, and tank crews.
3) Under certain models of viral infection, it can be very difficult to distinguish successful clearance of a virus from a super-effective virus. Both produce similar time traces of decreasing viral load: in one case because the viruses get killed, in the other because the pool of host cells dies off. This is particularly relevant for monitoring HIV.
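The ambiguity can be seen in the standard target-cell-limited model of within-host infection (T' = -βTV, I' = βTV - δI, V' = pI - cV). The sketch below integrates it with a simple Euler scheme; all parameter values are illustrative, not fit to any real virus. One parameter set clears the infection outright, the other lets the virus burn through the host-cell pool, yet both viral-load curves end up declining.

```python
def simulate(beta, delta, p, c, T0=1e6, I0=0.0, V0=10.0, dt=0.001, days=30):
    """Euler-integrate the target-cell-limited model; return daily viral load."""
    T, I, V = T0, I0, V0
    per_day = int(1 / dt)
    trace = []
    for n in range(int(days / dt)):
        if n % per_day == 0:
            trace.append(V)            # record once per simulated day
        dT = -beta * T * V
        dI = beta * T * V - delta * I
        dV = p * I - c * V
        T = max(T + dT * dt, 0.0)      # clamp to keep populations nonnegative
        I = max(I + dI * dt, 0.0)
        V = max(V + dV * dt, 0.0)
    return trace

# Scenario A: effective clearance -- infected cells and virions die quickly,
# so the infection never takes off and viral load falls monotonically.
cleared = simulate(beta=1e-7, delta=4.0, p=100.0, c=5.0)

# Scenario B: aggressive virus -- it spreads fast, exhausts the target cells,
# and viral load then declines for lack of anything left to infect.
burnout = simulate(beta=5e-6, delta=1.0, p=100.0, c=3.0)

# By day 30 both traces show a decreasing viral load; the late-time
# curves alone don't reveal which mechanism was responsible.
```

The only visible difference here is the early transient (scenario B spikes before falling), which is exactly the part of the trajectory clinical monitoring often misses.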