Since the first incarnation was released to the world in 1989, aspiring city planners have used SimCity as the canvas on which to paint their wildest urban design dreams. The latest installment has received strong reviews (at least, when operable), with many features not previously available. I have a special interest in a major change to the fundamental mechanism by which the denizens of SimCity are represented by the game. Heretofore, all statistics were aggregate – that is, the simulation computed average crime and average unemployment in a given region based on the present conditions, but no individual Sims were present to actually go to work or commit crimes. This has changed in the new version. Each Sim is an individual agent (a la The Sims) with responses to stimuli. As noted in this Penny Arcade webcomic, you can peek in on the activities of your individual charges:
I think it is hard to overstate the importance of this change. One of the major critiques of Keynesian economics articulated by Austrians like Russ Roberts is its emphasis on lumped variables, like total GDP or aggregate demand, which do not distinguish what made up those values. The classic line is, you can’t go to the store and buy a box of aggregate supply. In a similar way, the behavior of individual people, even with identical preferences, will depend on their particular circumstances.
Jensen’s inequality, put simply, says that the average of a function will only be equal to the value of the function at the average if the function is linear. The books The Flaw of Averages and Antifragile both make this point very distinctly. A statistician, as the joke goes, can drown in a river that is, on average, only 2 feet deep if it has even one section 12 feet deep. Similarly, one hour in freezing weather does not cancel out an hour in the blazing desert, even if the “average” temperature is a perfect 72 °F.
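The drowning-statistician joke can be made concrete in a few lines of Python. Here I invent a convex “harm” function of river depth (harm only kicks in past 6 feet, purely for illustration) and compare the harm at the average depth with the average of the harms – the gap between the two is exactly Jensen’s inequality at work:

```python
# Hypothetical convex harm function: no harm in shallow water,
# harm rising quadratically once depth exceeds 6 ft.
def harm(depth_ft):
    return max(0.0, depth_ft - 6.0) ** 2

# Nine 2-ft sections of river and one 12-ft section.
depths = [2.0] * 9 + [12.0]

avg_depth = sum(depths) / len(depths)                    # 3.0 ft "on average"
harm_at_avg = harm(avg_depth)                            # 0.0 -- looks safe
avg_harm = sum(harm(d) for d in depths) / len(depths)    # 3.6 -- it is not

print(avg_depth, harm_at_avg, avg_harm)
```

Because `harm` is convex, the average harm (3.6) exceeds the harm at the average depth (0.0): summarizing the river by its mean depth hides the one section that matters.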
Agent-based modeling is coming into vogue now that we have the increased computing power to handle the problem. One of my favorite examples comes from the study of mortgage refinancing during the heyday of collateralized debt obligations, which consisted of bundles of bonds representing the right to collect the mortgage payments as they came in from various homeowners. Falling interest rates led to a surge in refinancing – which benefited the homeowners but was not good for the holders of the mortgage, since it involves the early repayment of a debt that had been contracted at a higher interest rate. It was noted, however, that these refinancings mostly occurred early on. This is hard to explain if one thinks only about the “average” propensity of members of the group to make the effort to refinance when interest rates made this the favorable course of action.

In reality, each homeowner has his or her own threshold for refinancing, based on attention paid, tolerance for the hassle involved, and particular financial situation. (Since collecting information is itself costly, sometimes rational inattention is really justified.) Whatever the reason, there would always be some people who would never refinance no matter how low interest rates fell, and others who would do so at the first opportunity. The early refinancers would therefore select themselves out of the pool early, leading to a “burn out” in which the holdouts were likely to stick around to the end. The end result is that using a single, constant-in-time rate of exit (as in an exponential decay) to model homeowners leaving the pool leads to a systematic undervaluing of the CDO. An agent-based simulation makes much more sense.
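A minimal sketch of this burn-out effect, with made-up parameters: each homeowner gets a personal threshold for how far rates must fall before refinancing is worth the hassle, rates drop 2 points once and stay there, and each eligible homeowner only gets around to acting with 50% probability per period. Alongside it runs a constant-hazard (exponential decay) model calibrated to the first period’s exit rate:

```python
import random

random.seed(1)

N = 10_000
# Hypothetical heterogeneity: thresholds uniform between 0 and 5 points.
# Anyone whose threshold exceeds the rate drop will never refinance.
thresholds = [random.uniform(0.0, 5.0) for _ in range(N)]

RATE_DROP = 2.0   # rates fall 2 points and then plateau
P_ACT = 0.5       # chance per period that an eligible homeowner acts

pool = list(thresholds)
agent_remaining = []
for period in range(10):
    # Eligible homeowners (threshold below the rate drop) exit with prob P_ACT.
    pool = [t for t in pool if not (t < RATE_DROP and random.random() < P_ACT)]
    agent_remaining.append(len(pool))

# Constant-hazard comparison, calibrated to the first period's exit rate.
hazard = 1 - agent_remaining[0] / N
exp_remaining = [N * (1 - hazard) ** (k + 1) for k in range(10)]

print(agent_remaining)                     # plateaus near the ~60% holdouts
print([round(x) for x in exp_remaining])   # keeps decaying toward zero
```

The agent pool burns out: exits are front-loaded, and the roughly 60% of homeowners whose thresholds were never reached stick around indefinitely. The constant-hazard model, fit to the same first-period behavior, wrongly predicts the pool keeps shrinking – which is exactly the systematic mispricing described above.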