• Our new model incorporates several innovative features: For example, rather than using a representative household, it features a demographically accurate synthetic population with millions of households (matching age, education, race, and consumption habits). Instead of using a representative firm, we model the behavior of tens of thousands of the largest firms, in one-to-one correspondence with real firms….
        • J. Doyne Farmer, Making Sense of Chaos: A Better Economics for a Better World,[1] p. 258
Mainstream economics is willing to build a model of an economy in terms of a “representative individual.” One hypothetical consumer represents every household. One hypothetical firm represents every business. Many different types of workers are aggregated as “labor.” Many different types of machines and other productivity-enhancing factors (such as business reputation or process knowledge) are aggregated as “capital.” I have long questioned this way of doing economics, which I refer to as the “GDP factory” method of analysis.

For decades, J. Doyne Farmer and his relatively small cohort of like-minded researchers have advocated for and implemented a different approach. Borrowing from the field of ecology, they build models whose agents employ different strategies within the overall system.

The representative-individual approach involves carefully choosing a set of assumptions in the economist’s head about human behavior, representing those as equations, and solving the equations for a single equilibrium. It predates the age of the computer.

Farmer’s approach, illustrated in his new book Making Sense of Chaos, requires a very different modeling strategy, called “agent-based modeling.” It starts with observations about how different individuals choose strategies for earning a living, consuming, and investing. The goal is to see how these strategies interact with one another over time. This requires computer simulation.

For example, consider the stock market. The “representative individual” approach assumes a single investor with full information and one strategy for maximizing returns relative to risk. Farmer’s approach instead starts by looking at the types of strategies different investors actually use. Some focus on fundamentals. Others try to spot trends. Everyone has different information and uses different heuristics.
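A minimal sketch of a heterogeneous-agent market in this spirit: fundamentalists push the price toward a fixed fundamental value while trend followers extrapolate the most recent move. The agent weights, noise level, and other parameters below are illustrative inventions, not taken from the book.

```python
import random

def simulate_market(n_steps=500, fundamental=100.0, seed=0):
    """Toy two-strategy market. Fundamentalists buy when the price is
    below the fundamental value; trend followers chase the last price
    change. The price moves with the sum of their demands plus noise."""
    rng = random.Random(seed)
    prices = [fundamental, fundamental]
    for _ in range(n_steps):
        p, p_prev = prices[-1], prices[-2]
        fundamentalist_demand = 0.05 * (fundamental - p)  # mean reversion
        trend_demand = 0.3 * (p - p_prev)                 # momentum chasing
        noise = rng.gauss(0, 0.5)                         # idiosyncratic trades
        prices.append(p + fundamentalist_demand + trend_demand + noise)
    return prices
```

Even this toy version generates persistent swings around the fundamental value rather than the placid behavior a single-representative-investor model predicts; strengthening the trend-following weight makes the swings larger and longer.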


The representative-individual models of the stock market tend to have dynamic properties that are uninteresting and unrealistic. They predict minimal market movement, much less trading than we observe, and nothing like the pattern of run-ups and crashes that seems to characterize existing markets. The models with heterogeneous investors are able to replicate the patterns we actually observe in the stock market.

One of the most interesting findings from agent-based models is that as the influence of players using one strategy increases, the dynamics of the financial market change. Strategies that dampen volatility for a while can suddenly cause instability.

For example, Farmer points out that in the late 1990s major investment banks adopted “value at risk” (VaR) as a strategy for controlling market exposure. VaR measures the loss from, say, an adverse price movement of two standard deviations. Using such a metric, a risk manager would say that you can increase risk exposure as market volatility declines, and you have to decrease it when volatility goes up. In good times, you get a self-reinforcing feedback loop that raises asset prices as banks expand their portfolios. But then a little adversity leads everyone using VaR to try to sell at once, causing a really severe self-reinforcing loop on the downside. Farmer says that this describes what happened in financial markets before and during the financial crisis of 2008.
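The mechanics of the rule can be sketched in a few lines. The 1% VaR limit and the volatility figures below are hypothetical numbers chosen for illustration, not figures from Farmer:

```python
def var_target_position(capital, volatility, z=2.0, var_limit=0.01):
    """Largest position a VaR rule allows: the z-sigma loss on the
    position must not exceed var_limit * capital. Because volatility
    sits in the denominator, calm markets permit more exposure and
    turbulent markets force exposure down -- the procyclical feedback."""
    worst_case_loss_per_dollar = z * volatility
    return capital * var_limit / worst_case_loss_per_dollar
```

With $1 billion of capital and 2% volatility, the rule permits a $250 million position; if volatility doubles to 4%, the permitted position halves to $125 million, forcing sales into an already falling market.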

Farmer and colleagues also have used computer simulations of heterogeneous-agent strategies to analyze the energy market, with a particular focus on trying to assess the feasibility of an “energy transition” to forestall climate change. According to their analysis, the main cost from shifting toward renewable energy sources is upgrading the electric grid. But actually producing energy will be cheaper, so that overall a faster energy transition is a positive for the economy.

  • In 2050, for example, our estimated global annual expenditure on the electricity network for the Fast Transition is about $670 billion per year, compared with $530 billion per year for the No Transition. However, the expected total system cost in 2050 is about $5.9 trillion for the Fast Transition and $6.3 trillion per year for the No Transition. Thus, although the additional $140 billion of grid costs might seem expensive, it is significantly less than the savings that come from cheaper energy. p. 253
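The comparison in the quoted passage reduces to simple arithmetic on the book's figures:

```python
# Annual 2050 figures quoted from the book (global, rough estimates).
fast_transition = {"grid": 670e9, "total_system": 5.9e12}
no_transition = {"grid": 530e9, "total_system": 6.3e12}

extra_grid_cost = fast_transition["grid"] - no_transition["grid"]              # $140 billion
net_savings = no_transition["total_system"] - fast_transition["total_system"]  # $400 billion
```

The extra $140 billion per year of grid spending buys roughly $400 billion per year in lower total system cost, which is the sense in which the grid costs are “significantly less than the savings.”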

The mainstream approach to doing economic theory will always have the advantage of being easy to communicate and to replicate. When someone shows the results of a mainstream model, you can solve the equations yourself and get a feel for what is driving the results.

For empirical work, replication is not so reliable. Farmer reports that when he was with a company interested in exploiting stock market inefficiencies, his team looked at published papers on market anomalies.

  • For around half of the papers, we couldn’t reproduce the results, even when we tested the postulated deviation from efficiency using the same data. p. 146

Simulations are more opaque to those of us who are not on the team that built the model. We cannot reproduce the results for ourselves. If economists are going to adopt agent-based modeling, they are going to have to develop ways to articulate, explain, and justify the choices they make in constructing the models.

I think of economic models as being like maps. With an old-fashioned TripTik, if the map said to take the George Washington Bridge to get from where I live to Boston, I would have been stuck with that. With the map on my smartphone, I can consider alternatives, and even make adjustments in real time based on traffic conditions.

Podcast follow-up: From the Shelf with Curator Arnold Kling.

For economists, vast amounts of data are becoming available, and computer power has gone up by orders of magnitude. These trends presumably favor agent-based modeling relative to the representative-individual standard.

But as maps for policy makers, agent-based models are still far from reliable. I would be careful not to presume that they make centralized decision-making a good way to operate an economy. One should not bet the farm on Farmer.


Footnotes

[1] J. Doyne Farmer, Making Sense of Chaos: A Better Economics for a Better World. Yale University Press, 2024.


*Arnold Kling has a Ph.D. in economics from the Massachusetts Institute of Technology. He is the author of several books, including Crisis of Abundance: Rethinking How We Pay for Health Care; Invisible Wealth: The Hidden Story of How Markets Work; Unchecked and Unbalanced: How the Discrepancy Between Knowledge and Power Caused the Financial Crisis and Threatens Democracy; and Specialization and Trade: A Re-introduction to Economics. He contributed to EconLog from January 2003 through August 2012.


