Time and Temperature

I found this article very interesting, mostly due to the way a bold prediction – the data stored on this disk will be stable for a million years! – is based completely on probabilistic theory. From the news article:

“The probability that the system will jump in this way is governed by an idea known as Arrhenius law. This relates the probability of jumping the barrier to factors such as its temperature, the Boltzmann constant and how often a jump can be attempted, which is related to the level of atomic vibrations.”

Temperature is a measure of the average thermal energy available to the molecules of a system (kT). Since the particles are moving around and colliding randomly, energy is constantly being exchanged, and there is a chance that any given particle will have significantly more energy than the average. In fact, the probability is given by the exponential Boltzmann distribution. If there is an energy barrier (W), that is, an intermediate state higher in energy than the reactants, only the particles that are “thermally activated” can participate. This allows the rate constant for the reaction to be calculated using transition state theory. Basically, the expected time to wait for a reaction to occur scales as exp(W/kT), reflecting the same distribution of energy in the constituent molecules. Since the Boltzmann distribution maximizes the entropy for a given average energy, it is overwhelmingly the most probable state of affairs. However, there is always a (very small) chance that large enough deviations will occur much faster. So there is no guarantee that your data will be safe a million years hence, but the odds are in your favor if the activation barrier is large enough.
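
To get a feel for how sensitive that waiting time is to the barrier height, here is a rough back-of-the-envelope sketch in Python. The attempt frequency and the barrier values are illustrative assumptions on my part (a vibrational attempt rate of about 10^13 Hz is a typical order of magnitude), not numbers taken from the article.

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def waiting_time(barrier_ev, temp_k, attempt_freq_hz=1e13):
    """Rough expected waiting time (seconds) for a thermally activated jump.

    Follows the Arrhenius form t ~ (1/f0) * exp(W / kT), where f0 is the
    attempt frequency set by atomic vibrations. The 1e13 Hz default is a
    typical order-of-magnitude assumption, not a measured value.
    """
    return math.exp(barrier_ev / (K_B * temp_k)) / attempt_freq_hz

seconds_per_year = 3.15e7
for barrier in (1.0, 1.5, 2.0):  # illustrative barrier heights in eV
    t = waiting_time(barrier, temp_k=300)
    print(f"W = {barrier:.1f} eV  ->  ~{t / seconds_per_year:.2e} years")
```

Because the barrier sits in an exponent, a modest increase in W shifts the expected waiting time by many orders of magnitude, which is exactly what a million-year storage claim is banking on.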

 

[Figure: reaction coordinate diagram showing the activation energy barrier between reactants and products]

 

In biology, enzymes control the processes of life by selectively lowering the energy barriers at the appropriate times. Usually this is achieved by stabilizing the transition state (often by having the right charges in the right places). As a result, some reactions that would take millions of years in the absence of enzymes occur in milliseconds. This speed-up by a factor of 10^17 comes from the exponential dependence in the Arrhenius equation.
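
As a rough sanity check on that factor of 10^17, assuming room temperature (my assumption, just for the arithmetic), the exponential form tells us how much barrier lowering such a speed-up corresponds to:

```python
import math

K_B = 8.617e-5          # Boltzmann constant, eV/K
EV_TO_KCAL_MOL = 23.06  # unit conversion

# How much must an enzyme lower the barrier to speed a reaction up 10^17-fold?
# From rate ~ exp(-W/kT), a speed-up factor F corresponds to dW = kT * ln(F).
speedup = 1e17
temp_k = 300
delta_w_ev = K_B * temp_k * math.log(speedup)
print(f"Barrier reduction: {delta_w_ev:.2f} eV "
      f"(~{delta_w_ev * EV_TO_KCAL_MOL:.0f} kcal/mol)")
```

At 300 K that works out to roughly 1 eV, or about 23 kcal/mol, of transition-state stabilization.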

Risk Pooling

The lease recently expired at our apartment, and we moved from a privately owned townhouse to a brand new managed apartment building. Interestingly, we did not have to put down a deposit, in contrast to our old landlord, who had required the standard deposit equal to one month’s rent. In the new place, we only had to pay a one-time nonrefundable “insurance premium” of a couple hundred dollars. The big difference is that our landlord (as far as I could tell) owned only one property, so he was exposed to much more variability than the management company, which runs hundreds of units. As a result, he needed more assurance that he would not incur a large loss, even though holding a deposit might be costly (state law requires the money to be held in a separate account, any interest accrues to the tenant, and so on). The whole reason insurance exists at all is the “risk aversion” created by a non-linear function that maps outcomes into utility. That is, a big loss can be devastating, while the small cost of the premiums is not proportionally missed. Insurance for large losses is just a formalized way of taking advantage of the law of large numbers to average out risk, in a sense taking from “alternate yous” that live in universes where things turn out OK. Thus, insurance is supposed to be boring (as opposed to casinos), in the sense that it is designed to average out adverse events. This is why insurance commercials are made exciting in ways that have nothing to do with insurance.
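
A quick Monte Carlo sketch makes the pooling point concrete. The damage probability and cost below are invented purely for illustration; the point is how the spread of the average loss per unit shrinks as the pool grows.

```python
import random
import statistics

def per_unit_loss(n_units, p_damage=0.05, damage_cost=2000, trials=10_000):
    """Simulate the average damage cost per unit for a landlord with n_units.

    The 5% damage probability and $2000 cost are made-up numbers, used only
    to illustrate how pooling shrinks the variability of the average loss.
    """
    results = []
    for _ in range(trials):
        losses = sum(damage_cost for _ in range(n_units) if random.random() < p_damage)
        results.append(losses / n_units)
    return results

for n in (1, 10, 300):
    sample = per_unit_loss(n)
    print(f"{n:>3} units: mean ${statistics.mean(sample):7.0f}, "
          f"std dev ${statistics.stdev(sample):7.0f}")
```

The mean loss per unit stays the same, but its standard deviation falls roughly as one over the square root of the number of units, which is why the big manager can get by with a small flat fee while the single-property landlord wants a full month’s rent in hand.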

 

Insurance is a hot topic with the roll-out of Obamacare. Any health insurance system needs many healthy people paying premiums in order to cover the costs incurred by people who get really sick. Under the previous system, insurance companies made sure that people didn’t just wait until they got sick to sign up by denying coverage to those with preexisting conditions. This created the incentive for everyone to join the pool, although some still couldn’t afford insurance and took their chances. Now, in order to get rid of the preexisting-condition exclusions without letting people wait until they are sick to buy in, Obamacare mandates that everyone have some form of insurance.
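
A toy break-even calculation (with invented numbers) shows why the pool needs the healthy enrollees, and why premiums climb if they opt out:

```python
# Toy break-even premium calculation (all numbers are invented for illustration).
# A pool of mostly-healthy people with a few very expensive claims.
healthy_cost, sick_cost = 1_000, 50_000

def break_even_premium(sick_frac):
    """Average expected cost per enrollee: the minimum sustainable premium."""
    return sick_frac * sick_cost + (1 - sick_frac) * healthy_cost

# Everyone enrolled: 5% of the pool is sick.
print(f"Everyone enrolled:       ${break_even_premium(0.05):,.0f}")

# If half of the healthy people opt out, the sick make up a larger share of
# the pool and the break-even premium rises -- the adverse-selection problem
# the mandate is meant to head off.
print(f"Half of healthy opt out: ${break_even_premium(0.05 / (0.05 + 0.475)):,.0f}")
```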

Rational Markets?

Today the Nobel Memorial Prize in Economics (which, for the record, started in 1968 and was not among the original prizes established by Alfred Nobel’s will) was awarded for the empirical analysis of asset prices, work closely tied to the efficient market hypothesis. The funny thing is, one of the winners, Eugene F. Fama, is known for establishing the theory, while another, Robert J. Shiller, is famous for arguing against the same idea. The basic idea is that stock markets do a good job of incorporating all available data in pricing shares, to the extent that price movements follow a basically random-walk pattern as new events occur. In the weak form, this means that an individual investor cannot count on obtaining returns above the market rate by studying past prices in an attempt to find a pattern and bet accordingly, since if such a pattern existed, it would be erased by other investors who are looking at the same data. In the strong form, prices reflect all information, public and private, so perfectly that no one can ever achieve excess returns. Investing is just a matter of luck (a random walk), although riskier investments (that is, larger volatility) are rewarded with larger average returns. Shiller, who also lends his name to the Case-Shiller index of house prices, shows some very compelling data that stock prices are much more volatile than would be expected if investors were simply tabulating the present value of a company’s dividends. In what should not be a surprise to anyone, the stock market is susceptible to bubbles, booms, and busts. [I strongly recommend listening to Prof. Shiller’s lectures at Yale University, available for free on iTunes U.]
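
For the random-walk picture, a toy price simulation is easy to write. The 1% daily volatility below is an arbitrary choice of mine; the point is just that each day’s move is independent of the history, so studying past prices tells you nothing useful.

```python
import random

def random_walk_prices(start=100.0, days=250, daily_vol=0.01, seed=None):
    """Generate a toy price series where each day's move is pure random noise.

    Under the efficient-market picture, past prices carry no exploitable
    pattern, so tomorrow's change is independent of everything seen so far.
    The 1% daily volatility is an arbitrary illustrative choice.
    """
    rng = random.Random(seed)
    prices = [start]
    for _ in range(days):
        prices.append(prices[-1] * (1 + rng.gauss(0, daily_vol)))
    return prices

series = random_walk_prices(seed=42)
print(f"Start: {series[0]:.2f}  End: {series[-1]:.2f}")
```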

From a probability point of view, both models are interesting. There is a great deal of mathematical theory on random walks (or martingales) if one subscribes to the efficient market hypothesis. In fact, this view has helped increase interest in index funds, for a simple reason: if you can’t beat the market, just “buy the market” instead of paying someone else to gamble with your money. On the other hand, Robert Shiller’s more realistic assessment shows that stock markets are complex dynamic systems in which effects (rising stock prices) can become their own causes (more buying), so bubbles can form out of these vicious cycles. It might seem weird that the committee would honor both points of view. But this is how science works. It is still useful to think about the efficient market hypothesis even if it is not really true, just as it is useful to consider frictionless surfaces and massless ropes in physics. The ideal model (like all models) is wrong, but may still be a major advance. We should be aware that, if everything is working right, it should be impossible for the average investor to beat the market by being clever, but sometimes systems fail in spectacular ways.
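
To see how that kind of feedback amplifies swings, here is a cartoon model (emphatically not Shiller’s actual analysis; the parameters are arbitrary) in which part of yesterday’s price move feeds into today’s buying:

```python
import random
import statistics

def simulate_returns(feedback, days=2500, noise_vol=0.01, seed=1):
    """Toy return process where part of yesterday's move feeds into today's.

    feedback = 0 gives pure random-walk noise; feedback > 0 is a cartoon of
    trend-chasing (rising prices attracting more buying). This is only an
    illustration of the feedback idea, not Shiller's model.
    """
    rng = random.Random(seed)
    returns, prev = [], 0.0
    for _ in range(days):
        r = rng.gauss(0, noise_vol) + feedback * prev
        returns.append(r)
        prev = r
    return returns

for fb in (0.0, 0.5, 0.9):
    vol = statistics.stdev(simulate_returns(fb))
    print(f"feedback = {fb:.1f}: return volatility {vol:.4f}")
```

With no feedback the returns are just noise; as the feedback coefficient approaches one, the same underlying noise produces much larger swings, a crude picture of the excess volatility Shiller documented.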

Some have said that the overall message is that we have some hope of predicting the movement of the market over the long term (years), but no chance of predicting its short-term movement (days). Maybe this is a rebuke to the entire financial media industry, which tries to explain why the market moved up or down each session based on this or that piece of news. Adjust your expectations of “predictability” accordingly.

2013 Nobels

As widely expected, Peter Higgs was awarded the Nobel Prize in Physics (shared with François Englert) for predicting the existence of a particle (or better, a mechanism) that breaks the symmetry between massless particles, like photons, and massive particles, like quarks. He only had to wait five decades for the Large Hadron Collider to be built (and collect data this past year confirming the existence of the Higgs boson) in order to claim his prize. This is an extreme example of the ability of the human mind to predict theoretically what cannot be achieved experimentally for a long time.

In chemistry, the prize was awarded for the development of quantum chemistry techniques that allow the simulation of complicated molecules, including enzymes. Before, simulating these systems seemed intractable: the chemistry at the active sites, where electrons jump between molecules, relies on quantum mechanics (which involves computationally difficult calculations), but simulating an entire macromolecule that way would take far too much computing power. The laureates overcame this problem by breaking the simulation into regions where classical mechanics works fine (basically, outside the active site) and regions where the full quantum computation is required. This hybrid model allows the best of both worlds: the accuracy of quantum mechanics when it’s needed, and computational speed when it’s not. This is a great example of model building – the key is usually figuring out what you can safely ignore. As the saying goes: “All models are wrong, some models are useful.”
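
The structure of that hybrid idea can be sketched schematically in Python. Everything below is a placeholder of my own (the energy formulas are chemically meaningless); it only illustrates how a total energy might be assembled from an expensive treatment of the small active region, a cheap treatment of the rest, and a coupling term between the two.

```python
# Schematic of the hybrid (QM/MM-style) idea described above: treat a small
# "active" region with the expensive method and everything else cheaply.
# The energy functions below are placeholders, not real chemistry.

def expensive_quantum_energy(atoms):
    """Stand-in for a quantum-mechanical calculation (costly, accurate)."""
    return sum(a["charge"] ** 2 for a in atoms)  # placeholder formula

def cheap_classical_energy(atoms):
    """Stand-in for a classical force-field calculation (fast, approximate)."""
    return sum(abs(a["charge"]) for a in atoms)  # placeholder formula

def coupling_energy(qm_atoms, mm_atoms):
    """Stand-in for the interaction between the two regions."""
    return 0.1 * len(qm_atoms) * len(mm_atoms)   # placeholder formula

def hybrid_energy(atoms):
    # Partition the system: quantum treatment only where electrons rearrange.
    qm_region = [a for a in atoms if a["in_active_site"]]
    mm_region = [a for a in atoms if not a["in_active_site"]]
    return (expensive_quantum_energy(qm_region)
            + cheap_classical_energy(mm_region)
            + coupling_energy(qm_region, mm_region))

atoms = [{"charge": 0.4, "in_active_site": True},
         {"charge": -0.4, "in_active_site": True},
         {"charge": 0.1, "in_active_site": False},
         {"charge": -0.1, "in_active_site": False}]
print(f"Total (schematic) energy: {hybrid_energy(atoms):.2f}")
```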