Congress Investigates DSGE Models: One Witness Says Big Problem Is The Lack of Brain Power

You may have guessed that I enjoy poking the ruling elite in economics. That was the case with a self-declared tyrant who threatens the careers of my students, a Fed Vice Chair who thinks it best that neither his students at Columbia nor researchers at the Fed be trained in modern computational skills, and a Harvard professor who insults my fellow Midwesterners. This post is not one of those. V. V. Chari and I were colleagues and buddies at MEDS in the early 1980s, and Chari is part of many of the great memories I have of those days. Chari and I have discussed the issues I raise below. He understands that this is business, and that I take no pleasure in this post.

Chari has always been an articulate expositor of economic ideas. In 2010, he was asked to use those skills to give testimony at the Congressional hearing with the ambitious title “Building a Science of Economics for the Real World.” The witnesses were Robert Solow, Sidney Winter, Scott Page, David Colander and Chari. Under “Purposes”, the record said “If the generally accepted economic models inclined the Nation’s policy makers to dismiss the notion that a crisis was possible, and then led them toward measures that may have been less than optimal in addressing it, it seems appropriate to ask why the economics profession cannot provide better policy guidance.” Great question! 

Chari’s task was to defend DSGE models against the criticisms of the other speakers. When you read Chari’s remarks, you will understand why he was chosen for that job. Chari was an articulate advocate, perhaps as good as Johnnie Cochran (not John Cochrane). However, when you read his remarks more carefully, you realize it is hard to be a successful advocate if the gloves fit and someone other than Marcia Clark shines a bright light on his arguments.

Chari begins his oral testimony by describing how “Macroeconomic research … is a very big tent and accommodates a very diverse set of viewpoints.” He notes that state-of-the-art DSGE models “have heterogeneity, all kinds of heterogeneity arising from income fluctuations, unemployment and the like. They have unemployment. They do have financial factors. They have sticky prices and wages. They have crises. And they have a role for government.”

This sounds impressive, but from what I have seen it would be more accurate to say that the typical DSGE model has heterogeneity OR sticky prices OR a role for government OR crises OR… etc. This is a weakness in the DSGE literature. As I said in my 1998 book, “suppose that meteorologists took this approach to studying the weather; they would ignore complex models and their ‘black box’ computer implementations and instead study evaporation, or convection, or solar heating, or the effects of the earth’s rotation. Both the weather and the economy are phenomena greater than the sum of their parts, and any analysis that does not recognize that is inviting failure. Simple, focused theoretical studies give us much insight, but they can only serve as a step in any substantive analysis, not the final statement.”

In his prepared statement, he expands on a “simple message about all models: Models are purposeful simplifications that serve as guides to the real world, they are not the real world.” Chari is absolutely correct on this. However, this is true of all models in any field of science or engineering. All scientists must abstract from irrelevant details. The challenge is to determine which details are irrelevant and which are not. I argue that this is where serious computational tools can be helpful. If intensive computing shows that detail X is irrelevant, then it will be safe for researchers to use simpler, more tractable models that ignore detail X.

That is not the direction Chari takes. Instead he asserts that “Abstracting from irrelevant detail is essential given scarce computational resources…” Computational resources are scarce, as is everything in some absolute sense, but this was clearly not a binding constraint for economics in 2010. For example, Imrohoroglu, Imrohoroglu and Joines (International Journal of High Performance Computing Applications, 1993) used the University of Minnesota CRAY X-MP supercomputer to solve models with incomplete markets. Ferreyra (AER, 2007) used Condor, a parallel programming tool running on a cluster of Unix workstations at the University of Wisconsin, and reports that her estimation problem took about a week. Yongyang Cai’s 2008 PhD thesis also used Condor, parallelizing dynamic programming problems across up to 200 workstations. High performance computing was used in economics before 2010 at both the University of Wisconsin and the University of Minnesota. In 2008, I was a co-PI of CIM-EARTH, a project which received funding from the MacArthur Foundation. In the proposal, Argonne Labs (managed by the University of Chicago) promised access to its supercomputers to help develop high performance software for economics. Supercomputing resources were available to economists before 2010, but economists instead chose to use far less powerful computer systems. It is clear that in 2010 there was absolutely no problem with scarcity of computing power.
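The reason dynamic programming farms out so well to a Condor-style pool of workstations is that each state’s Bellman update is independent of every other state’s update within an iteration. The sketch below is purely illustrative (it is not Cai’s code, and the toy growth model, grid, and discount factor are my own assumptions); it uses Python’s process pool as a stand-in for a cluster of workers:

```python
# Illustrative sketch only: the "embarrassingly parallel" structure that lets a
# cluster split a dynamic-programming problem. The growth model, grid, and
# discount factor below are assumed for illustration, not taken from any paper.
from concurrent.futures import ProcessPoolExecutor
import math

BETA = 0.95                               # discount factor (assumed)
GRID = [0.5 + 0.1 * i for i in range(20)]  # capital grid (assumed)

def bellman_at(args):
    """One Bellman update at state k. It reads the old value function but
    writes nothing shared, so updates are independent across states."""
    k, v_old = args
    output = k ** 0.3                      # toy production function (assumed)
    candidates = [
        math.log(output - kp) + BETA * v   # log utility of consumption
        for kp, v in zip(GRID, v_old) if kp < output
    ]
    return max(candidates)

def parallel_iteration(v_old, workers=4):
    """Farm the state grid out to a pool of processes (a stand-in for the
    cluster of workstations a Condor pool would provide)."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(bellman_at, [(k, v_old) for k in GRID]))

if __name__ == "__main__":
    v = [0.0] * len(GRID)
    for _ in range(20):                    # repeated Bellman iterations
        v = parallel_iteration(v, workers=2)
    print(f"v(k_min)={v[0]:.3f}, v(k_max)={v[-1]:.3f}")
```

On a real cluster the pool of processes becomes a pool of machines, but the division of labor is the same: each worker updates its slice of the state space, and the results are gathered before the next iteration.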

If scarcity of computational resources was not a problem, then what was? Chari actually pointed to two problems in his testimony, saying “Abstracting from irrelevant detail is essential given scarce computational resources, not to mention the limits of the human mind in absorbing detail!”

Yes, human minds do have limits, but anyone who has ever attended a supercomputing conference has seen that human minds are able to absorb enormous amounts of information when they use high-performance computing.

Chari appears to claim that macroeconomics suffered from the limitations of the human mind. When I had lunch with Chari a few years ago, I asked him “What minds were you referring to?” He did not answer the question, but I wish I had had a camera to capture the look on his face. 

Before we pile on Chari, we should note that none of the other panelists (not Solow, Winter, Page, nor Colander) rushed to defend the quality of the minds behind macroeconomic research. I often hear comments like this. One leading macroeconomist told me (in an email I have kept) that his IQ was the reason why he does not use modern computational methods. Another economist told me that his economics department focuses on economics training and does not give students training in computational methods because the students are not smart enough to learn both.

I have addressed both of Chari’s criticisms in earlier posts. My post https://showmethemath.org/2022/01/dsice-dynamic-stochastic-integration-of-climate-and-economy/ discusses our use of a supercomputer to compute the social cost of carbon in the presence of economic and climate uncertainty. My webpage https://kenjudd.org/published-papers/ has a link to a paper where we use a supercomputer to solve for all Nash equilibria in dynamic games with state variables. Economists do have access to computing power, at least until other economists take that access away. Furthermore, some economists have the mental ability to harness the power of supercomputers to solve difficult problems in economics.

The posts https://showmethemath.org/2022/02/jpe-does-not-want-papers-written-for-smart-people/ and https://showmethemath.org/2022/02/clarida-to-columbia-students-ask-nyu-professors-for-help/ show that macroeconomists do not want papers written for smart people and do not want to make their students smarter.

The Congressional hearing “Building a Science of Economics for the Real World” asked “why the economics profession cannot provide better policy guidance.” It was not scarcity of computational resources in 2010, and it surely is not a constraint in 2022. We are left with the explanation that the problem is a scarcity of brain power, a problem apparently unique to economics.
