Sunday, June 19, 2016

June 20, 2016 at 02:15AM

Today I Learned: 1) ...some delightful tidbits about parakeets, via Ching-Ching Shiue. I'm actually not sure which species we were talking about, so... keep that in mind. Your mileage may vary for other parakeets. Firstly, those birds grow up quickly! It's a couple of weeks from egg-laying to hatching, and within a month of growth they look more or less like grown parakeets, though smaller. Apparently it's really important to let mother parakeets keep their eggs, or to give them a replacement; taking them away is quite stressful. What does a stressed parakeet look like? Well, one way to gauge a bird's comfort is to check for fluffing. A bird with fluffed feathers is more likely to be happy (or cold -- fluffed feathers are great insulation), while a smooth-feathered bird is more likely to be on edge (or hot). Making noise may also be a sign of comfort in a bird -- scared parakeets aren't likely to make a lot of sound, which makes some sense.

2) There are a *lot* of mantis species. About 2,400, to be (sort of) precise. I haven't been able to find solid figures on how many of those live in the US, but there seem to be somewhere around a dozen common species.

3) So, I've been wondering for a while what happened to Moore's law when it comes to processing speed. I remember seeing 4.0 GHz one-core processors (and 2.5-ish GHz two-core processors) on the market about 7 years ago. By the popular reading of Moore's law (strictly, it describes transistor counts doubling every couple of years, not clock speed), we should be seeing processors running at something like 50 GHz by now. We don't. Most commercial processors are now in the 2.5-3.5 GHz range. What gives? Well, it turns out that Moore's law for processors has been doing just fine, at least in the supercomputer world. Both highest-end processor speed and cost per FLOPS have been improving roughly exponentially at least until a couple of years ago, which is where the data on Wikipedia stops. Part of this is increasing parallelization -- more cores per processor die -- but that can't be most of the improvement. I'm actually still not certain how processors are getting faster without their clock speeds getting faster, but I think it has to do with improved per-clock efficiency (more instructions per cycle) and bigger caches. Fun fact: in 1961, a gigaFLOPS of processing power would have cost roughly $8.3 trillion. Actually, I'm sure it would have cost much more than that in practice -- by the time you'd built half of the computers required to reach a gigaFLOPS, I suspect you'd have unbalanced the world market for things like heavy metals enough to make it even more expensive. As of January 2015, that cost is more like $0.08 per gigaFLOPS (someone built an 11.5 teraFLOPS supercomputer for $902.57). A quick back-of-envelope check of these numbers is sketched below.
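Here's a minimal sketch of the two calculations in item 3. The start speed, elapsed time, and doubling period are assumptions pulled from the text (a 2-year doubling is assumed), not measurements.

```python
# Rough check of the figures in item 3. The inputs are assumptions
# taken from the post above, not measured data.

start_ghz = 4.0        # single-core clock speed seen ~7 years earlier
years = 7
doubling_years = 2.0   # assumed Moore's-law doubling period

# Naive extrapolation: clock speed doubling every doubling_years
projected_ghz = start_ghz * 2 ** (years / doubling_years)
print(f"Naive projection: {projected_ghz:.0f} GHz")  # ~45 GHz, i.e. "something like 50 GHz"

# Cost per gigaFLOPS from the January 2015 example
system_cost_usd = 902.57
system_teraflops = 11.5
cost_per_gflops = system_cost_usd / (system_teraflops * 1000)  # TFLOPS -> GFLOPS
print(f"Cost per gigaFLOPS: ${cost_per_gflops:.2f}")  # ~$0.08
```

Run as written, this prints roughly 45 GHz for the projection and $0.08 per gigaFLOPS, matching the figures quoted above.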
