Monday, June 13, 2016

Today I Learned: 1) I'm one very small step closer to understanding statistical mechanics. Today I read through most of a 1957 paper on Information Theory and Statistical Mechanics by E. T. Jaynes (see here: http://ift.tt/1UuZbs8). The general thrust of the paper is that you can derive pretty much all the important equations of statistical mechanics by just assuming the least informative probability distribution on whatever quantity you're interested in. Jaynes' point is that statistical mechanics (and much of thermodynamics) really boils down to a question of this form: if you know the mean of some function of a bunch of variables, what's the mean of some other function of those variables?

A classic example is calculating the temperature of a bunch of molecules given their average energy -- you know the average energy of the system, but you don't know the energies of the individual particles, yet you *need* to somehow figure out how fast the molecules are moving, on average. That's an impossible thing to calculate explicitly, because it's a massively underdetermined system. You can, however, put a probability distribution on the spread of energies across particles, which lets you put a probability distribution on the possible speeds of those particles, which gives you a probability distribution on the temperature of the system. It turns out that if you use the least informative prior distribution on the spread of energies across particles (where "least informative" can be formally defined as "highest Shannon entropy"), there's only one such prior, it's the Boltzmann distribution on particle energies, and it gives an extremely tightly peaked posterior distribution on temperatures around the one we actually measure. (There's a sketch of the derivation at the end of this post.) According to the article, just about all of the important equations in statistical mechanics can be derived this way: put a least informative prior on whatever variables you don't know, crunch through Bayes' theorem, and you almost always get a tight posterior around the value scientists think you should get. To be honest, I'm trusting Jaynes on the math, but I'd love to go through it in more detail with someone well-qualified (Bear Bear Bear?).

A bonus from this article: I also learned another set of rules that can be used to derive the formula for Shannon entropy. Shannon entropy, which was supposed to quantify the information content or "average surprise" in a probability distribution, was invented using the following assumptions: 1) entropy is a continuous function of the probabilities of each possible event; 2) if all possible outcomes are equally likely, then more possible outcomes means higher entropy; and 3) if a choice is broken down into two successive choices, the total entropy is the entropy of the first choice plus the probability-weighted entropy of the second -- which means, in particular, that the entropies of independent events add. There is only one functional form that satisfies those three conditions, and that's the one Shannon entropy uses (it can be stated using any base logarithm). A quick sanity check of that is also at the end of the post.

2) The contents of soylent! Actually, of two different formulations of soylent. The one my roommate and I tried yesterday was soylent 1.5, which is mostly brown rice protein, oat flour, and sunflower oil, mixed with all the additional vitamins and minerals known to be important for human health. There's also now a soylent 2.0, which is a bottled meal-drink made from isolated soy protein, algal oil, isomaltulose (a beet disaccharide that supposedly releases energy more smoothly than glucose), and the aforementioned vitamins and minerals.
The marketing on soylent 2.0 is a little different from 1.5 -- they're no longer selling it as a total food replacement, instead claiming it's a meal replacement for when you just need some cheap food without having to cook. They also claim it's tastier (we'll see) and much more environmentally friendly, thanks to the large fraction of algal oil, which is produced at high density in bioreactors.

3) My Pogonomyrmex ants aren't particularly into soylent. I gave them a crumb of soylent cookie, and a couple of workers sat on it for a while, but they didn't stay long.
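As promised, here's the maximum-entropy derivation in sketch form. Fair warning: this is my reconstruction of the standard Lagrange-multiplier argument, not Jaynes' exact notation. You maximize the Shannon entropy of the distribution over energy levels, subject to normalization and the known mean energy:

\[
\max_{p}\; H(p) = -\sum_i p_i \ln p_i
\qquad \text{subject to} \qquad
\sum_i p_i = 1, \quad \sum_i p_i E_i = \langle E \rangle.
\]

Introduce Lagrange multipliers \(\lambda\) and \(\beta\) for the two constraints and set the derivative with respect to each \(p_i\) to zero:

\[
\frac{\partial}{\partial p_i}\left[ -\sum_j p_j \ln p_j - \lambda\Big(\sum_j p_j - 1\Big) - \beta\Big(\sum_j p_j E_j - \langle E \rangle\Big) \right]
= -\ln p_i - 1 - \lambda - \beta E_i = 0,
\]

which solves to

\[
p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i}.
\]

That's the Boltzmann distribution, with \(\beta\) fixed by the mean-energy constraint (physically, \(\beta = 1/kT\)).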
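And for the Shannon entropy bonus: the unique functional form that the three conditions pin down is \(H = -K \sum_i p_i \log p_i\), where the constant \(K > 0\) (equivalently, the choice of log base) just sets the units. Here's a quick numerical sanity check of conditions 2 and 3 -- the distributions below are toy examples I made up, not anything from the paper:

    import math

    def entropy(p, base=2):
        """Shannon entropy of a discrete distribution, in units set by `base`."""
        return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

    # Condition 2: among equally likely outcomes, more outcomes means more entropy.
    print(entropy([1/2, 1/2]))          # 1.0 bit
    print(entropy([1/3, 1/3, 1/3]))     # ~1.585 bits

    # Condition 3, in its independent-events form: entropies add.
    p = [0.5, 0.25, 0.25]
    q = [0.9, 0.1]
    joint = [pi * qj for pi in p for qj in q]  # joint distribution of the independent pair
    print(entropy(joint))               # ~1.969 bits
    print(entropy(p) + entropy(q))      # same ~1.969 bits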
