TFA says that information is disorder. And thermodynamic entropy, for some definitions of "order", is order as well. If you have all of the air molecules in a room compressed into one corner, maybe that's ordered? But that's one small lump of air and a whole lot of vacuum. Evenly distributed air is more ordered because it is uniform.

If you let a system starting in any arbitrary corner-gas configuration (and there are a lot of them, since every molecule has its own position and velocity) run for some time X, you find that you have almost certainly ended up in an even-gas configuration. On the other hand, if you start in an even-gas configuration and run for time X, you will almost certainly still be in an even-gas configuration. This may seem at odds with the fact that the laws of motion are time-reversible (at least if you assume that molecules are frictionless billiard balls, as physicists are wont to do). But it's not. If you take some specific corner-gas start A and run it for time X, you will (probably) end up at an even-gas configuration B. If you take B, reverse the velocity of every molecule, and run it for time X again, you will be back at A (again, assuming molecules are frictionless billiard balls).

But with discrete space and velocity, you can count the possible position and velocity vectors, and there are a LOT more even-gas configurations than corner-gas configurations. So, with a tiny room and only a few molecules, you can compute the chance that, starting at even-gas and running for time X, you end up at corner-gas. Even for very small systems it's basically zero. Entropy is the concept of changes to a system that are not reversible, not because of the laws of PHYSICS but because of the laws of STATISTICS. The second law is the observation that, statistically, you will tend toward a uniform (ordered) system, because there are a lot of ways to go in that direction and very few ways to go the other.
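To make the counting concrete, here's a toy calculation (mine, not from the comment above): a room discretized into M cells holding N distinguishable molecules, positions only, with M = 64 and N = 10 picked purely for illustration. It just counts microstates and shows how vanishingly rare corner-gas configurations are.

```python
# Toy microstate count: M discrete cells, N distinguishable molecules,
# positions only (tracking velocities too would only widen the gap).
M = 64           # cells in the room (illustrative assumption)
CORNER = M // 8  # cells making up one corner octant
N = 10           # molecules (illustrative assumption)

corner_states = CORNER ** N  # every molecule confined to the corner
total_states = M ** N        # every molecule anywhere in the room

print(f"corner-gas microstates: {corner_states:.3e}")
print(f"total microstates:      {total_states:.3e}")
print(f"fraction corner-gas:    {corner_states / total_states:.3e}")
# The fraction is (1/8)**10, about 9.3e-10: even with only ten molecules,
# a random configuration is essentially never corner-gas. With ~10**23
# molecules the fraction is zero for all practical purposes.
```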
Landauer's observation is that any computational device, at the end of the day, stores information mechanically (again, for our purposes subatomic particles are frictionless billiard balls, so even things like the atom trap from TFA are mechanical devices). So if you have a 32-bit register, it has 2^32 configurations. If you count the possible ordered sequences of X single-bit flips, there are 32^X of them. If you start at 0, almost all of those flip sequences will take you to a pretty chaotic state. But if you start from a random state, almost none of those same flip sequences will get you back to 0. So treating the system as a completely mechanical one, thermodynamics applies and puts statistical limits on such changes.

What Landauer did is fix a maximum circuit temperature T for your memory/CPU, and observe that you don't want Brownian motion flipping your bits, so 0 and 1 need a minimum energy separation for the system to be useful at temperature T. This puts a lower bound on the state counts, and lets traditional thermodynamics establish a minimum energy dissipation to go from a high-entropy state to a low one (like a zeroed-out register). What information entropy does is take the same result and say that the disordered information therefore has intrinsic entropy, since regardless of system design it takes a certain minimum entropy to store that information.

It's avoidable if your system is reversible, which is possible if a bit pattern has more physical representations the more ordered it is: fewer ways to store 10010101 than ways to store 00000000, so that resetting to zero doesn't actually shrink the count of physical states. It's also beatable if you find a way to store information non-physically. But good luck on that front.
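The bound this argument produces is the standard statement of Landauer's principle: at least k_B · T · ln 2 joules dissipated per erased bit. A quick back-of-the-envelope (my numbers; the room-temperature T = 300 K is an assumption, not something from the comment above):

```python
import math

# Landauer's bound: erasing one bit dissipates at least k_B * T * ln(2).
k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0           # assumed operating temperature, kelvin

per_bit = k_B * T * math.log(2)  # joules per erased bit
register = 32 * per_bit          # zeroing out a 32-bit register

print(f"minimum dissipation per erased bit: {per_bit:.3e} J")
print(f"zeroing a 32-bit register:          {register:.3e} J")
# Roughly 2.9e-21 J per bit at 300 K -- many orders of magnitude below
# what real logic gates dissipate per switching event, which is why the
# bound is a matter of principle rather than engineering (so far).
```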
Neat, huh? I took a course on Kolmogorov Complexity [wikipedia.org], which is somewhat related, and pretty cool.