Multiplicity

When grading multiple-choice quizzes, I often note that it is much more common for a student to get one question wrong than to get a perfect score. This is a good example of the importance of multiplicity, which is a cornerstone of thermodynamics. It is also a good opening for some of the most interesting examples of how more randomness can make systems more, not less, predictable.

Let’s distinguish between what we will call “macrostates” and “microstates.” A microstate is a complete description of a system, like a gas where we could know and write down (at least in theory) all of the positions and velocities of all the particles. As you might imagine, for any reasonably sized gas, made up of trillions upon trillions of molecules, this would be completely impractical and not even very useful, since you could not easily watch the particles move anyway. A macrostate is made up of many microstates that fulfill a desired set of criteria, for example, all microstates that have exactly half of the particles on the right side of a container and half on the left side. By definition, this involves a loss of information as we “lump” together many microstates in service of our arbitrarily chosen rules. However, we will see how useful this way of thinking is.

First, back to our quizzes. What we call “one wrong” is really the macrostate that includes the five microstates (only #1 wrong, only #2 wrong, only #3 wrong, only #4 wrong, only #5 wrong). Let’s start with the simple case in which a student has a 50-50 chance of answering any individual question correctly. In this case, all possible microstates are equally probable. It is then simply a matter of counting the number of microstates that fulfill the rule that you are really interested in, to wit, the total number correct. This “lumping” is done every day by teachers, who count equally all microstates that have the same number of right answers (assuming that they are all worth the same number of points, of course).

The way to do this is with the mathematical operation called “combinations.” You could ask, how many ways are there to answer exactly three questions right on a five-question quiz? This is written as “five choose three.”
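In symbols, that count is the binomial coefficient:

```latex
\binom{5}{3} = \frac{5!}{3!\,(5-3)!} = \frac{120}{6 \cdot 2} = 10,
\qquad \text{and in general} \qquad
\binom{n}{k} = \frac{n!}{k!\,(n-k)!}
```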

For small values, it is often easiest to read off the answer from Pascal’s Triangle, named for mathematician Blaise Pascal despite the fact that he was hardly the first person to discover it. The Triangle is built by starting with ones down the sides and filling each interior entry by adding together the two numbers directly above it.
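A minimal sketch in Python (the function name `pascal_row` is my own, not from the post) builds each row from the previous one using exactly that addition rule:

```python
def pascal_row(n):
    """Return row n of Pascal's Triangle (the apex is row 0, which is [1])."""
    row = [1]
    for _ in range(n):
        # Each new row is: 1, the sums of adjacent pairs from the old row, 1.
        row = [1] + [a + b for a, b in zip(row, row[1:])] + [1]
    return row

for n in range(6):
    print(pascal_row(n))
# Row 5 reads [1, 5, 10, 10, 5, 1]; entry 3 (counting from zero) is "5 choose 3" = 10.
```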

In the case of “5 choose 3,” the answer is ten: counting the apex as row zero, it is the entry three places in (again counting from zero) on row five, which reads 1, 5, 10, 10, 5, 1. Notice that “1” is the answer to any question “x choose x,” since there is only one way to get a perfect score, no matter how many questions are on the exam. There is also a symmetry between getting the same number wrong or right (“x choose n” has the same value as “x choose x−n”). Thus, just as it is hard to get all the questions right, it is equally hard to get them all wrong. If we relax our assumption that there is an equal chance to get each question right or wrong, we still have to take into account the number of ways to get a certain final score; we just need to fold in the fact that each microstate is no longer equally probable. Calculations such as these are built from Bernoulli trials, a series of independent events, each with the same probability of success, which are summed to get the probability that exactly some given number of them occur. This gives rise to the Binomial Distribution.
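Putting the counting and the per-microstate probability together gives the familiar formula: if each of $n$ questions is answered correctly with probability $p$, the probability of exactly $k$ right answers is

```latex
P(k) = \binom{n}{k}\, p^{k} (1-p)^{n-k}
```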

Something interesting happens when you let the number of trials get very large. As long as the chance for each individual trial to be a success is fixed, the limit of the Binomial Distribution for large n is the Normal distribution. This is our first taste of the Central Limit Theorem, which tells us that, under certain conditions, the mean of independent trials will converge to a normal distribution. The key is the phrase “under certain conditions,” since the central limit theorem works in so many diverse circumstances that it seems like magic. However, the exceptions are real and important, as will be described later. If there are a huge number of trials, as with the question of whether a gas particle is on the right side or the left side of a container, then deviations from the mean become so tiny that they can be completely ignored. To be specific, the standard deviation in absolute terms scales as sqrt(n), so in relative terms it goes as 1/sqrt(n). Therefore, the chance of an imbalance even as small as 50.00001% is astonishingly minuscule. The more molecules randomly bouncing around, the more predictable the outcome is! This is why no one has ever suffocated because all the air molecules in the room randomly congregated on the other side.
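A short sketch (my own illustration, not from the post) makes the scaling concrete, using the exact binomial standard deviation $\sqrt{np(1-p)}$ rather than a simulation:

```python
import math

def relative_spread(n, p=0.5):
    """Standard deviation of a binomial count, divided by its mean n*p."""
    return math.sqrt(n * p * (1 - p)) / (n * p)

for n in (100, 10_000, 1_000_000):
    print(n, relative_spread(n))
# Each 100-fold increase in n shrinks the relative spread 10-fold: the 1/sqrt(n) law.
```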


Noether’s Certainty Theorem

Werner Heisenberg – He may not know where he is going, but he knows exactly where he is
Amalie “Emmy” Noether – Shown here demonstrating her theorem of conservation of awesomeness

By now, most people have heard of Heisenberg’s Uncertainty Principle. It states that, no matter how carefully you measure, there are some pairs of values that cannot both be known with as much precision as you like. Instead, there is a fundamental limit on how much we can know simultaneously about, for example, the position and momentum of a particle. This statement has become a darling of some philosophy majors, since it speaks to human knowledge and its limits, and also, perhaps, because it takes a little wind out of the sails of all those snooty physicists who think they know everything. However, there is a much less well known but connected principle that vastly increases what we know about the universe and our ability to make predictions. In fact, as much as the uncertainty principle casts a pall over our conceit that we can understand everything, there is a “certainty principle” (that goes by a different name, a woman’s name in fact) that provides a foundation for the whole enterprise of science. First, though, an explanation of uncertainty.

There is a simple way to understand the uncertainty principle. Imagine trying to determine the position and momentum of an electron. To see where it is and how fast it is going, you need to bounce something off of it. Just as when you see something, say, a basketball, with your eyes under normal conditions, you are watching the rebound of the light particles that fell on it. Let’s say you decide to shine a single particle of light, a photon, on the electron. Light comes in a continuous spectrum of energies (what we call colors, when in the visible range), so you might first try to use a photon with a small amount of energy, so as not to disturb the momentum of the particle very much. However, a low-energy photon has a long wavelength. This limits the resolution of the position, since you cannot resolve anything with greater precision than the wavelength of the light you are using to see it. If you instead use a photon with a small wavelength, to better observe the position, the photon will have greater energy and interfere with the measurement of the momentum. There is an inherent trade-off between your knowledge of the position and your knowledge of the momentum, since any measurement you perform will disturb the system to some extent. You might try to improve your technique and be as careful as you can, but there is a fundamental limit: the more you know about the position, the less you can know about the momentum.
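The trade-off can be made quantitative: a photon’s energy and momentum are both set by its wavelength, so the shorter the wavelength (and the finer the position resolution), the bigger the kick delivered to the electron:

```latex
E = \frac{hc}{\lambda}, \qquad p = \frac{h}{\lambda}
```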

An alternate explanation, and one I find more lucid, is to remember that all particles, matter and photons alike, have the properties of waves. To describe an electron with a definite momentum, you would use a sine wave of a certain frequency. Since that wave extends forever in all directions, you know nothing about the position of the electron if it has only one possible momentum. To formulate a localized electron, one needs to add together several momentum states, represented by sine waves of different wavelengths, to end up with a “wave packet.” For the electron’s position to be perfectly known, you would need an infinite number of sine waves, destroying all information about the momentum of the particle. (Animations of building up an increasingly localized state by adding more and more momentum states make this especially vivid.) Expressing a position state as the sum of momentum states (and vice versa) is the same as performing the mathematical operation called a Fourier transform. Like playing an orchestral symphony using only tuning forks, Fourier’s principle states that any arbitrary waveform can be produced by adding together a sufficient number of pure tones in the right proportions. This is not just of academic interest; the fact that your .mp3 player can hold your entire music collection depends on the ability of song files to be compressed by expressing them in the shorthand of Fourier transforms.
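A quick numerical sketch of the wave-packet idea (the function and parameter names are mine, chosen for illustration): summing cosines whose wavenumbers carry a Gaussian weight produces a bump localized near the origin, which dies away as you move from the center.

```python
import math

def wave_packet(x, k0=10.0, sigma_k=1.0, n_waves=201):
    """Superpose cosines with a Gaussian weight in wavenumber k around k0."""
    total = 0.0
    for i in range(n_waves):
        # Sample k uniformly over k0 +/- 4 standard deviations.
        k = k0 - 4 * sigma_k + 8 * sigma_k * i / (n_waves - 1)
        weight = math.exp(-((k - k0) ** 2) / (2 * sigma_k ** 2))
        total += weight * math.cos(k * x)
    return total

# The packet is largest at x = 0, where every cosine adds in phase,
# and is nearly zero a few envelope widths away, where they cancel:
print(wave_packet(0.0), wave_packet(5.0))
```

Widening `sigma_k` (adding more momentum states) makes the bump in `x` narrower, which is the uncertainty trade-off in miniature.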

In fact, Heisenberg’s Uncertainty isn’t just one principle; it applies to several pairs of observable quantities that, like position and momentum, are “conjugate variables” by virtue of being Fourier transforms of each other. Time and energy are also conjugates of each other, as are the angle of an object and its angular momentum.
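In quantitative form, the position–momentum and energy–time relations read:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad
\Delta E \,\Delta t \;\ge\; \frac{\hbar}{2}
```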

But there is a much less widely known principle based on conjugate variables that nevertheless has a huge impact on the way we understand the universe.


Conservation

This week in physics lab, my students are testing the principle of conservation of energy by dropping a marble through a photogate and comparing the kinetic energy the marble gained with the gravitational potential energy it gave up as it fell. In a perfect world, these quantities would be equal, and thus total energy conserved. Since we don’t live in a perfect world, energy is continually dissipated by insidious forces like air resistance and friction. Regarding this point, I had serendipitously just read the following comic from xkcd.com and asked my students to write about it:

SOURCE: XKCD.com

What I wanted them to think about is the divergence between the tools we use to describe the world, powerful and useful as they may be (total energy always stays the same), and what is really practical to do. According to the principle of conservation of energy (also known as the first law of thermodynamics, for those keeping track at home), the scheme shown in the comic should be a perfectly acceptable way to “transport wind,” with no loss of intensity. Of course, this comes with the proviso that the turbine must capture all of the wind energy and be perfectly efficient, the power cable must have zero electrical resistance, and the fan must also be 100% efficient.

However, we know intuitively (or at least, we should) that in the real world nothing is perfectly efficient, and the actions of the cartoon’s protagonist will inevitably lead to colossal waste, since each step of converting from wind to electricity and from electricity back to wind entails a loss of energy.
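A back-of-the-envelope sketch shows how the losses compound multiplicatively at each conversion step. The efficiency numbers here are invented for illustration; they are not taken from the comic or from any real hardware:

```python
# Hypothetical efficiencies for each stage of the "transport wind" scheme.
stages = {
    "turbine (wind -> electricity)": 0.40,
    "cable (transmission)": 0.95,
    "fan (electricity -> wind)": 0.60,
}

power = 1.0  # start with 1 unit of wind power
for stage, efficiency in stages.items():
    power *= efficiency
    print(f"after {stage}: {power:.3f}")
# With these made-up numbers, only about 23% of the original wind power survives.
```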

In case you think that the whole discussion is silly, and that no one in their right mind would convert so profligately from one form of power to another, consider the case of back-up power for server farms. Google has a patent on a method that saves energy by reducing the number of times electricity has to be converted between AC and DC. Evidently, this was an innovation, and existing server farms were built with inefficiency wired in:

From Google’s “Story of Send”

Now, there are limits, even in principle, to the efficiency of some processes. Once you get past the idyllic world of the first law, the second law of thermodynamics puts upper bounds on what you can do, even if you try your hardest to squeeze every last bit of efficiency out of certain ways of converting energy between various forms. However, as one of my students pointed out, there is no reason in principle why you couldn’t drive forever in a “perfect” hybrid car that recaptured all of the energy from braking (and from going up hills), at least if air resistance and rolling friction could also be ignored.
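The canonical example of such a second-law bound (standard textbook material, not discussed in the post itself) is the Carnot limit on the efficiency of any heat engine operating between a hot reservoir at temperature $T_h$ and a cold one at $T_c$:

```latex
\eta_{\max} = 1 - \frac{T_c}{T_h}
```

No amount of engineering cleverness can beat this ratio, which is why even an ideal engine throws some energy away as waste heat.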

 

Physics Nobel

Peter Higgs is going to have to wait at least one more year (although he might be used to waiting by now, since it took researchers almost half a century to find his boson in the first place). The 2012 Nobel Prize in Physics is going to Serge Haroche and David Wineland for their work on observing and manipulating individual quantum systems. For my fellow Americans who like to keep track of US Nobel winners as a quantifiable metric of our awesomeness at science, not only is Dr. Wineland an American, he works for us taxpayers as an employee of NIST. Lest you think that tax dollars should not be frittered away on basic science, consider some of the important applications to come from government research, as in: “The next time a GPS keeps you from getting lost, thank a fed.” Now, this current research promises to point the way toward the next generation of quantum computers and optical clocks. More importantly, this work helps us make sense of how, as far as we can tell, the Universe actually works. Far from being a weird curiosity, the paradoxes of quantum mechanics are how things really behave when you peel back all the layers. It’s our everyday experience, where things aren’t (or at least seem to act like they are not) in multiple places at once, that needs explaining.
