Thursday, March 31, 2016
March 31, 2016 at 03:02AM
Today I Learned: 1) Thermodynamics! Specifically, why temperature plays the role it does in chemical reactions, as reasoned from the Boltzmann distribution. If you're well-versed in statistical mechanics, you probably aren't going to learn anything from this post. For the rest of y'all, I'm going to try to explain the role temperature plays in chemistry, using basic statistical mechanics. You'll need some high school algebra, internet access and a browser capable of using wolframalpha.com, and some patience (yeah, these thermodynamics TILs tend to end up pretty long). I'll also be writing out equations, and Facebook doesn't make writing out equations particularly easy. It should be pretty straightforward if you follow along on paper. First, let's define a state. That's pretty simple -- a state is an arrangement of atoms (and their momentums) of some collection of atoms. Actually, it doesn't have to be atoms -- you can apply statistical mechanics to anything where you can define states, energies over states, and temperatures, but atoms are the usual agents in thermodynamics. Let's consider a super-simple system consisting of a coin with the states "lying flat on a table" (I'll call it "FLAT") and "standing on its edge" (I'll call it "EDGE"). States are said to have an energy (or, more fomally, a Hamiltonian) that describes in a number how "stable" a state is (there are other ways to interpret what an energy is -- I don't have a way to explain it from first principles). The lower the energy of a state, the more stable it is. Let's assign some hypothetical energies to the coin states. The FLAT state is much more stable, so we'll give it a lower energy. Lets say FLAT has energy 1, while EDGE has energy 1,000. One of the more fundamental results of statistical mechanics (which I'm taking as brute fact) is the Boltzmann distribution. Statistical mechanics tells us (again, as I'm taking as brute fact) that systems with certain properties follow the Boltzmann distribution (basically, where you randomly sample states A LOT, which is generally true for chemistry and which we will assume is true for the coin example by assuming we either observe a hell of a lot of coins or we come back to the same one over and over again, with enough time in between that we can't predict the state of the coin from the last state it was observed in). What's does that mean? It means this -- the probability of a system being in a state x is proportional to e^-(E(x)/kT) where e is the number e, E(x) is the energy associated with state x, and T is the temperature. Oh, and there's k, the Boltzmann constant, which is an extremely important cosmologic constant and super-important for making units work out in this equation... but it's also just a number, so I'm going to drop it from here on out*. So, what does this say? It says that as the energy of a state gets higher, the probability of that state goes down essentially exponentially. I say the probability of a state is *proportional* to the thing above, but what's the *actual numeric probability* of a state? To find that, we use the fact that the sum of the probabilities of all of the states has to add up to 1, which means that if you divide the above thing by the sum-of-those-above-things-for-all-possible-states, you get the actual probability of that state. Written out, p(x) = e^-(E(x)/T) / Σ(e^-(E(s)/T)) where the sum is over all states s (this would be a good place to start following along on paper, if you aren't already). This is super-important, because it means the probability of each state is affected by the probabilities of all other states. Ok, so what's temperature doing in this equation? Let's consider the coin example again. What's the probability of the coin being on its edge, according to the Boltzmann distribution? There are only two possible states, so it's pretty easy to write out that equation above: p(EDGE) = e^-(E(EDGE)/T) / (e^-(E(EDGE)/T) + e^-(E(FLAT)/T)) There are a lot of terms with T in them up there. We can simplify things by dividing the numerator and denominator through by e^-(E(EDGE)/T), yielding: p(EDGE) = 1 / (1 + e^-(E(FLAT)-E(EDGE)/T)) That's much nicer. Lots more 1's, much fewer T's. Note that this is basically a simple function of two variables: there's T, the temperature; and there's E(FLAT) - E(EDGE), the difference between energies of the two states**. Let's call that difference X. To get an idea of what this function actually looks like, check it out on wolframalpha.com with T = 1 (http://ift.tt/1pMv7Qq). Positive values of X mean that FLAT has a higher energy (is less stable), while negative values of X mean that EDGE has a higher energy. If the two states have the same energy (X=0), then p(EDGE) is 0.5 -- the two states are equally likely. As X gets higher (FLAT has higher relative energy), the probability of EDGE gets higher pretty quickly, until it effectively saturates at 1. As X gets lower (EDGE has higher relative energy), the probability of EDGE drops dramatically until it bottoms out at 0. This matches intuition -- the more stable (lower-energy) state is much more likely. For our example, we said that E(FLAT) = 1, while E(EDGE) = 1000, so we're WAY off to the left -- FLAT is extremely unlikely. That's at temperature 1. What happens if we turn up the heat? Well, the only thing T does in the equation above is to modify X. Specifically, T acts to *scale* X -- as T increases, X/T gets closer to zero. It's kind of like the X-axix gets stretched out; or, equivalently, raising the temperature pushes you towards X = 0 in the plot I linked to above. As an example, let's try turning T up to 1000: http://ift.tt/1M2hA1D. Now you can see that 999 difference in energy isn't so much -- EDGE is now relatively likely. Raising the temperature *flattens out* the probabilities of different states by moving them from the edges of our probability graph toward the middle (http://ift.tt/1pMv5YT to see the probability of EDGE as a function of both X AND T). Conclusion: high temperature makes all states equally probable. Low temperature makes low-energy states much more likely. Now, at this point, if you're like me, you're thinking something's terribly wrong with this picture. After all, when you turn on your stovetop burner, your water doesn't start taking on random states. It does something very specific -- it heats up, then boils. What gives? To understand what's going on here, we have to think about the *microstates* of the pot of water vs its *macrostates*. The *microstates* of a system are really what I've been talking about above -- they're all of the possible arrangements of atoms in a system. That's too many atoms to individually track, so we usually talk about the *macrostate* of a system, which is a collection of techncially different but experimentally indistinguishable microstates. For instance "the water is liquid" is a macrostate -- there are bazillions of ways you can arrange the molecules in that pot of water that will still look liquid, so we call it a single macrostate. "The water is a vapor" is another macrostate. Some macrostates have a LOT MORE possible microstates than others. We say that those macrostates have high "entropy". That's all entropy is -- the number of microstates that look like a macrostate. What's happening when you boil water is that water vapor has MASSIVELY more states than liquid water, but microstates that look like water have much lower energy. When the temperature is low, the low-energy states are favored strongly, and your pot of water stays bound together as a liquid. When the temperature is raised enough, suddenly the probabilities of all of the states become more or less equal***. But! There are tons more "water is a gas" states than there are "water is a liquid" states, and when you add together all of the tiny-and-equal probabilities for the water-is-gas microstates, they collectively massively outweigh the sum of the probabilities for the tiny-and-equal water-is-liquid microstates. In other words, higher entropy macrostates are favored at high temperature. And there you have it -- the mathematics behind why low temperatures favor low energy and high temperatures favor high entropy. * If you're not comfortable with my dropping constants randomly, just pretend that for the rest of the TIL, T is actually the Boltzmann constant times temperature. ** You may have heard that there's no thing as absolute energy, only relative energy. Perhaps this gives some intuition into why -- if you add some constant amount to all of the energies of the states in a system, the resulting distribution of states is indistinuishable. *** I suspect, but have not yet convinced myself, that the exponential-looking shape of the plot of microstate probability around 0 energy difference are at least part of why you tend to see sharp phase transitions when you raise the temperature in a lot of setups -- changing the temperature mostly slides you around the flat tails of that distribution, where microstate probabilities aren't really affected much, until you suddenly hit a bit where microstate probabilities start exponentially moving away from 0 and 1. This produces a relatively sharp transition from an energy-dominated regime to an entropy-dominated one. The trouble is that temperature change doesn't move you linearly in that graph, and I don't have a good enough intuition for exactly how it does move you -- can anyone confirm or deny? Chris Lennox? Robert Johnson? Suzannah Fraker? Andrew Andy Halleran? Anders Knight? 2) Portable soup! Portable soup is the ancestor of modern boullion cubes. Portable soup was invented in the 18th century as a food for sailors. It's basically a soup, concentrated down to a thick gelatinous substance that can be stored apparently indefinitely. You can either dissolve it back into soup or just chew it. Portable soup was used as foodstuff in the Brittish Royal Navy until around 1815, at which point research suggested that it wasn't actually particularly good for sailor health, and was replaced by canned meats. Thanks to Tara Sullivan for enlightening me about portable soups. Many more juicy details on the portable soup wiki page (no joke: http://ift.tt/1SpBCRV) 3) ...how negative autoregulation can make genetic responses faster. See, the amount of a protein a cell has is determined by two factors: how quickly that protein is produced, and how quickly that protein is broken down (or diluted out). It turns out that the *speed at which a cell can change* the amount of protein in the cell is largely determined by the speed of degradation. Yes, if you're turning on a gene, you can make it respond faster by cranking up the production rate, but then you end up producing a lot *more* of that protein in the end, so you need to couple that with a higher degradation rate anyway to maintain the same steady-state protein level. ...unless the gene represses itself. Then you can have a high initial production rate when the gene is just turning on, which gives you a fast 'on' response, but the gene can regulate itself to whatever steady state you need. Note that this doesn't help at all with *deactivation* of a gene -- then you're still limited by how quickly the protein is degraded. * when a gene represses its own production Bonus fact, courtesy of Mengsha Gong: The cluster fig, though a beautiful tree, is an interesting kind of nusiance plant -- its roots are really, *really* good at finding water, and will happily search out pools, septic tanks, and sprinkler systems.
Labels:
IFTTT,
TodayILearned
Subscribe to:
Post Comments (Atom)
No comments:
Post a Comment