Monday, October 24, 2016
October 25, 2016 at 12:34AM
Today I Learned:

1) If Python throws an error at you complaining that a method received an instance instead of something else (or got one more positional argument than it expected), you probably forgot to put a "self" argument in the method declaration.

2) ...a little bit more about ring species, which I mentioned yesterday. I didn't realize that the bit where the species at the two ends of the geographic range can't interbreed is a *requirement* of a ring species -- I thought it was just a thing that *could* happen. I was wrong. Along with that, I learned today that ring species are incredibly rare. Wiki claims there are only four known ring species (three kinds of birds and a tree), and that three of them are controversial (the tree is definitely a ring species).

3) Markov-chain Monte Carlo (MCMC) is a class of algorithms for drawing random samples from a distribution you can't explicitly write out, as long as you can evaluate its density (at least up to a normalizing constant) at any given point. A key property of MCMC algorithms is that they're memoryless -- each new sample depends only on the current one, not on the rest of the walk's history. (That doesn't make consecutive samples independent, though: they're correlated, which is why people discard burn-in and thin the chain.) Today I learned a bit about how several different flavors of MCMC actually work. Classical Metropolis-Hastings MCMC starts with a "walker" at some point. In each iteration of the algorithm it takes a random step in some direction, then probabilistically decides to either stay there or go back, based on how high the density of the underlying function is at the new point (the higher the density, the more likely it is to stay; intuitively, this means it spends more of its time where the density is high). Adaptive MCMC works the same way, except that when the adaptive walker takes a step, it isn't totally unbiased in its direction -- based on past results, it tries to learn when there is covariance in the underlying function (imagine a narrow ridge in the function's landscape) and changes what directions it will step in accordingly.
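To make the Metropolis-Hastings loop concrete, here's a toy 1-D sketch in plain Python -- my own illustration, not taken from any library, with made-up function names:

```python
import math
import random

def metropolis_hastings(log_density, x0, n_steps, step_size=1.0, seed=0):
    """Minimal 1-D Metropolis-Hastings sampler with a Gaussian proposal.

    `log_density` only needs to be correct up to an additive constant,
    which is the whole point: we never need the normalized distribution.
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        # Take a random step in some direction...
        proposal = x + rng.gauss(0.0, step_size)
        # ...then stay or go back, based on the density ratio.
        # Higher density at the proposal => more likely to stay there.
        if math.log(rng.random()) < log_density(proposal) - log_density(x):
            x = proposal
        samples.append(x)
    return samples

# Target: a standard normal, given only by its unnormalized log-density.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=3.0, n_steps=20000)
kept = samples[2000:]  # discard burn-in
mean = sum(kept) / len(kept)
var = sum((x - mean) ** 2 for x in kept) / len(kept)
```

After burn-in, `mean` hovers near 0 and `var` near 1, as you'd hope for a standard normal.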
That adaptive trick saves some computational effort when the walker encounters very narrow ridges, which are apparently pretty common. Vanilla Metropolis-Hastings spends tons of iterations jumping off the cliff to either side and rejecting the new value, so it takes a long time to move around on those ridges; adaptive MCMC spends most of its time just walking along the ridge. emcee, "the MCMC Hammer" (a very popular Python package for MCMC), works by having many walkers, all working in parallel. At each iteration, each walker picks another walker and takes a random step along the line connecting the two (it can land short of the other walker or overshoot past it). This solves the ridge problem much the same way as adaptive MCMC, but I think it's a bit more robust on certain kinds of functions, and it's *much* more parallelizable on a multi-core system, since multiple walkers can be updated in one operation.
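Here's a rough sketch of the ensemble idea -- my own toy re-implementation of the Goodman & Weare "stretch move" that emcee is built on, not emcee's actual code:

```python
import math
import random

def ensemble_step(walkers, log_density, rng, a=2.0):
    """One sweep of Goodman & Weare-style stretch moves over a 1-D ensemble.

    Each walker picks a random partner and proposes a point on the line
    through itself and the partner; no step-size tuning is needed because
    the ensemble's own spread sets the scale (that's the ridge trick).
    """
    n = len(walkers)
    for k in range(n):
        j = rng.randrange(n - 1)
        if j >= k:
            j += 1  # the partner must be a different walker
        # z is drawn from g(z) proportional to 1/sqrt(z) on [1/a, a].
        z = (1.0 + (a - 1.0) * rng.random()) ** 2 / a
        proposal = walkers[j] + z * (walkers[k] - walkers[j])
        # The stretch-move acceptance in d dimensions has a z**(d-1)
        # factor; here d = 1, so that factor is just 1.
        if math.log(rng.random()) < log_density(proposal) - log_density(walkers[k]):
            walkers[k] = proposal
    return walkers

# Same toy target: a standard normal via its unnormalized log-density.
rng = random.Random(1)
walkers = [rng.uniform(-4.0, 4.0) for _ in range(50)]
chain = []
for step in range(600):
    ensemble_step(walkers, lambda x: -0.5 * x * x, rng)
    if step >= 100:  # crude burn-in
        chain.extend(walkers)
mean = sum(chain) / len(chain)
var = sum((x - mean) ** 2 for x in chain) / len(chain)
```

This sketch updates the walkers one at a time; the actual package splits the ensemble into two halves and updates each half against the other, which is what lets it move many walkers in parallel.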