My advice to new PhDs on the job market

My 42 years of experience with economists have taught me much about academic economics. Here are four points you should consider.

1: Be prepared for the ignorance of your interviewers

A student of mine was asked about the computational method he used. He tried to explain the interior point method he was using, but the interviewer declared that the algorithm must have been NFXP. My student earned a PhD in computational mathematics after 2000, whereas the interviewer received his PhD before the interior point revolution in mathematical programming. That did not matter to the economist interviewer: he held the common belief that NFXP is the only way to solve a dynamic discrete choice empirical problem. Of course, this has also been declared to be true by many leading economists in their Econometrica papers.

I have had the same experience in seminar presentations. One of my favorite presentations was on the MPEC approach to structural estimation to an audience that included a pack of hyenas with PhDs from elite universities constantly barking about how what I had done was impossible. I enjoyed it because I am old and knew what to expect.

Economists are poor at mathematics and some become very hostile if you try to help them do better math. When you criticize existing work, the authors may view your work as undermining their case for a Nobel Prize. Not all economists are a problem. There are some economists, such as theorists, who generally value the introduction of better mathematical methods into economic analysis. There are also many economists who have no delusions of a future Nobel Prize and are eager to learn better methods. I think I have given you sufficient information regarding who to worry about.

2: Read the fine print

There is a common belief that the first appointment is a three-year contract, and that you can be fired after one year only for cause. Verbal descriptions of your job are irrelevant. BEFORE YOU TAKE A JOB, READ THE FINE PRINT. Only the written document matters.

3: Be careful when sharing ideas

You will be presenting your job market paper in your seminar, but you will be asked to describe your other research ideas in your office visits with individual professors. Be careful about what you tell people, because there is no record of what you say in those interviews. If you tell someone about an idea of yours and later see that person writing similar work, you have to remain silent; no one will believe you. I have no example I can share with you regarding a student’s discussion of research ideas, but I can speak to journal attitudes. One editor told me that once you put something out there, anyone can use it in their own work, with or without attribution. I know of (and can document) two cases where authors made errors in their paper, other people identified the errors and provided the correct analysis, and the editor thought it was fine for the authors to incorporate the corrected analysis, done by others, into their published papers without acknowledging or thanking those who did the correct analysis.

4: Consider industry jobs

People like me want the academic life and would never consider going to industry. I used to tell young people with mathematical and computational skills about the great opportunities in academic economics for their skills. I no longer do that. Many people my age have stopped doing serious research in favor of consulting because they are unhappy with the current situation with journals and academics in general. I am unhappy about the brain drain to industry but I must admit that academia has lost much of its appeal. One old guy told me that economics is corrupt and always will be, and that no amount of criticism by me or anyone else will change that.

It is clear to me that the past 40 years have seen a substantial decline in the intellectual standards in economics along with an increase in opportunities in industry. If I were a new PhD, I would probably still go into academia, but it would be a much more difficult decision today than it was 42 years ago. I recommend that you keep an open mind regarding non-academic jobs.

DSICE: Dynamic Stochastic Integration of Climate and Economy

I started working on climate change policy modeling in 2008, and it has been a major focus of my efforts since then. In 2010, Yongyang Cai, Thomas Lontzek and I created the DSICE model, extending Nordhaus’ DICE to include productivity shocks as well as stochastic elements of the climate system. While we had earlier published some applications of DSICE, the most complete exposition and application appeared in the Journal of Political Economy in December 2019. The JPE version is also by far the most computationally intensive paper ever written in the Integrated Assessment Modeling literature, combining climate with modern ideas in dynamic stochastic economic modeling.

Authorship

I must first clarify a detail. As the paper says, I was a coauthor in all substantive aspects. JPE made it clear that the presence of my name as an author reduced the chances of it being accepted. I was proud of what Yongyang, Thomas and I had accomplished. I wanted the paper to appear in JPE and did not want my name hurting my coauthors’ career progress. Therefore, I removed my name but continued to work on the paper even after I removed my name as an official author.

The economic questions

The economic question explored was “What is the social cost of carbon, and how does it depend on parameter assumptions?” Even though we examined a wide range of parameter specifications for Epstein-Zin preferences and the stochastic productivity process advocated by the macroeconomics literature, the range for the current social cost of carbon (also, from a world policy perspective, the optimal carbon tax) was $40-$100 per ton of carbon. This range includes the results of other models but extends higher because we include economic uncertainty. The key intuition is that the loss function is convex, so increasing the variance of future temperatures increases the social cost of carbon.
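That convexity argument is just Jensen’s inequality, and it is easy to see in miniature. The sketch below (illustrative Python; the quadratic damage function and the numbers are my stand-ins, not the DSICE calibration) compares expected damages under a certain temperature and under a mean-preserving spread:

```python
import statistics

def damage(temp):
    """Toy convex damage function: losses grow with the square of warming."""
    return 0.005 * temp ** 2

# Two temperature distributions with the same mean (3 degrees) but
# different variance. Because damage() is convex, the riskier
# distribution has higher expected damages (Jensen's inequality).
certain = [3.0, 3.0]
risky = [1.0, 5.0]
assert statistics.mean(certain) == statistics.mean(risky)

expected_damage_certain = statistics.mean(damage(t) for t in certain)
expected_damage_risky = statistics.mean(damage(t) for t in risky)

print(f"{expected_damage_certain:.3f}")  # 0.045
print(f"{expected_damage_risky:.3f}")    # 0.065
```

Both distributions have mean 3, but the spread raises expected damages because squaring penalizes the high draw more than it rewards the low one; the same logic pushes up the SCC once economic uncertainty is included.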

We also analyzed the impact of a stochastic tipping process, such as glacier melting leading to rising sea levels. Damages from tipping processes are different from damages related to business cycle fluctuations because, for example, the melting of glaciers is irreversible from the perspective of economic planning. Those damages are only moderately correlated with consumption. Therefore, the stochastic asset pricing kernel that DSICE implicitly computes will discount tipping point damages at a lower rate, magnifying their contribution to the SCC. More generally, we show that there is no one discount rate for climate change damages and that consumption CAPM considerations will affect the SCC.
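The discounting argument can be sketched in consumption-CAPM notation (a schematic with power utility, not the Epstein-Zin formulas of the paper): the SCC prices the stream of marginal damages \(D_t\) with the stochastic discount factor \(m_t\),

```latex
\mathrm{SCC}_0 = \mathbb{E}_0\!\left[\sum_{t \ge 1} m_t D_t\right],
\qquad
m_t = \beta^t \left(\frac{c_t}{c_0}\right)^{-\gamma},
\qquad
\mathbb{E}_0[m_t D_t] = \mathbb{E}_0[m_t]\,\mathbb{E}_0[D_t] + \operatorname{Cov}_0(m_t, D_t).
```

When \(\operatorname{Cov}_0(m_t, D_t) \approx 0\), there is no offsetting covariance term, so the damages are effectively discounted near the risk-free rate, below the rate appropriate for consumption-correlated payoffs; that is why weakly correlated tipping damages contribute disproportionately to the SCC.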

Our analysis is a major advance in the IAM literature. We used the full five-dimensional climate model developed by Nordhaus, whereas many authors use far simpler climate models. Some assume that CO2 emissions immediately heat the atmosphere, ignoring the heating process in the atmosphere and the presence of the ocean as a heat sink. Climate scientists can use the simplified approach because they think in terms of millennia; economists cannot ignore events at annual, or even quarterly, frequencies. We solve the dynamic programming model with one-year time periods and have checked that the results are unchanged by reducing the time period. A few others have added economic risk to their models, but they assume far less variance than standard macroeconomic estimates. Some have included tipping point phenomena in their models, but using less realistic specifications.

Twenty-five years ago, I wrote in my book that if meteorologists used the same approach to research as economists, “they would ignore complex models … and instead study evaporation, or convection, or solar heating, or the effects of the earth’s rotation. Both the weather and the economy are phenomena greater than the sum of their parts, and any analysis that does not recognize that is inviting failure.” Our DSICE analysis shows that we now can solve models with realistic economic shocks, realistic specifications for tipping points, and the full Nordhaus climate model. Furthermore, it shows that this kind of multidimensional modeling can be done in many areas of economics.

This paper goes back several years. The code was developed by early 2012, applied to a simpler specification and deployed on a small supercomputer. Thomas Lontzek presented the first version at the 2012 Conference on Climate and the Economy organized by the Institute for International Economic Studies. Yongyang Cai presented this paper at the conference “Developing the Next Generation of Economic Models of Climate Change Conference” at University of Minnesota, September 2014. Earlier versions include Hoover economic working paper 18113 (2017) (https://www.hoover.org/research/social-cost-carbon-economic-and-climate-risk), arXiv:1504.06909 (2015) (https://arxiv.org/abs/1504.06909), NBER working paper 18704 (“The social cost of stochastic and irreversible climate change”), “DSICE: A dynamic stochastic integrated model of climate and economy” (2012) (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1992674), and “Tipping points in a dynamic stochastic IAM” (2012) (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1992660).

Verification to demonstrate accuracy of our numerical results

This paper introduces two features that help document the validity of our computational results. As many know, I do not trust anyone’s computational results, even my own. My lectures frequently use the phrase “Trust, but verify,” taken from the Russian Doveryáy, no proveryáy. The JPE paper’s results relied on trillions of small optimization problems and billions of regressions; the sheer scale of the problem justifiably raises reliability questions. DSICE uses value function iteration over centuries, necessary because of the non-stationary nature of the problem. Each iteration takes the time t value function, computes the time t-1 value function at a set of points efficient for approximation, and then applies regression to approximate the time t-1 value function. At each iteration, we check the quality of this approximation by computing the difference between the approximation and the true value at a random set of points in the state space. Our verification tests tell us that we have three- to four-digit accuracy for most of the important functions. This approach to verifying computational results can be applied to any computational work in economics and would help address the replication problems in the field. We are not aware of any other serious work that performs demanding verification tests, but we strongly advocate their adoption by authors, editors, referees, and journals.
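The verification idea is easy to replicate in miniature. The sketch below (stdlib-only Python; the toy function, node set, and polynomial degree are my choices, not the DSICE code) fits an approximation by regression on a grid of nodes and then measures its error at random points that played no role in the fit:

```python
import math
import random

def true_value(k):
    """Stand-in for the 'true' value function on [0.5, 2.0]."""
    return math.log(k) + 0.1 * k

def fit_poly(xs, ys, degree):
    """Least-squares polynomial fit via normal equations (tiny, stdlib-only)."""
    n = degree + 1
    # Normal equations A c = b with A[i][j] = sum x^(i+j), b[i] = sum y x^i.
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * n
    for i in reversed(range(n)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, n))) / A[i][i]
    return coef

def poly(coef, x):
    return sum(c * x ** i for i, c in enumerate(coef))

# "Regression" step: fit the value function on a grid of approximation nodes.
nodes = [0.5 + 1.5 * i / 20 for i in range(21)]
coef = fit_poly(nodes, [true_value(k) for k in nodes], degree=4)

# Verification step: measure the error at random points NOT used in the fit.
random.seed(0)
test_points = [random.uniform(0.5, 2.0) for _ in range(1000)]
max_err = max(abs(poly(coef, k) - true_value(k)) for k in test_points)
print(f"max verification error: {max_err:.2e}")
```

The reported maximum error plays the role of the accuracy check described above: if it is unacceptably large, the approximation (here, the polynomial degree or the node set) must be refined before moving to the next iteration.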

Code for reader use

Many journals encourage authors to share their code, but our code contains copyrighted material and cannot be publicly posted. Our second novel feature will help readers check the accuracy of our computations and use the computed value and policy functions for their own simulations. To that end, we created Mathematica notebooks for a few examples; the notebooks contain the solutions along with code that allows the reader to check first-order conditions.

Doing the “impossible” despite fierce opposition

The kind of modeling in DSICE has often been called, and is still being called, impossible to do. Many authors indicate that they would like to examine more general models (which would still be far simpler than DSICE) but assert that they are intractable. We were able to solve the models in the JPE paper for several reasons: we used high-quality numerical methods for integration and optimization, we developed basis functions suited to the problem, and we were able to use massive parallelization. The key novelty in this paper was using supercomputing at a scale far beyond other economics work (as far as we know).

Massive parallelism is natural for solving dynamic programming problems. We got our first experience by working with Miron Livny (developer of HTCondor), Stephen Wright and Greg Thain, all at the University of Wisconsin. Livny gave us access to a UW cluster (“unlimited access with zero priority”), which worked fine for Yongyang’s 2008 PhD thesis. When we began developing DSICE in 2010, we had to move to supercomputers. In November 2012, Bob Rosner, former director of Argonne, suggested we apply for time on the Blue Waters supercomputer supported by the NSF. We had nearly continuous access to Blue Waters from March 2013 until NSF ended the project in 2019. Yongyang and I wrote the five proposals, wrote the end-of-year reports, and made the required appearances at the annual Blue Waters Symposium. Yongyang did the coding without major support from Blue Waters staff; in fact, he found a bug in their compiler. From 2013 to 2019, we received over one million node hours, where each node had between 16 and 32 cores, adding up to over 25 million core hours. Access to Blue Waters made it possible to solve the most complex example in our JPE paper with about 80,000 cores for over four hours. We owe a lot to the University of Wisconsin people who got us started and to the Blue Waters project for giving us what we needed for the applications of DSICE used in our JPE paper.
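Why is massive parallelism natural here? The Bellman maximization at each point of the state grid is independent of every other point, so one value-iteration step decomposes into embarrassingly parallel tasks. The toy sketch below (my own illustration, not the DSICE code) makes that structure explicit with a thread pool; on a machine like Blue Waters the same point-wise decomposition is mapped onto MPI ranks across tens of thousands of cores:

```python
from concurrent.futures import ThreadPoolExecutor

# Toy growth model: V_{t-1}(k) = max_c u(c) + BETA * V_t(k').
# The maximization at each grid point k uses only k and the time t
# value function, so one value-iteration step parallelizes trivially.
BETA = 0.95

def utility(c):
    return c ** 0.5

def next_value(k, v_t):
    """Bellman update at one state point via a coarse grid search over consumption."""
    best = float("-inf")
    for frac in range(1, 100):
        c = k * frac / 100
        k_next = k - c + k ** 0.3   # illustrative transition law
        best = max(best, utility(c) + BETA * v_t(k_next))
    return best

v_t = lambda k: utility(k)          # terminal guess for the time t value function
grid = [0.1 * i for i in range(1, 51)]

# Map the independent point-wise problems across workers; on a cluster the
# same structure maps onto MPI ranks instead of threads.
with ThreadPoolExecutor(max_workers=8) as pool:
    v_values = list(pool.map(lambda k: next_value(k, v_t), grid))

# Sanity check: the parallel sweep matches a serial sweep point by point.
serial = [next_value(k, v_t) for k in grid]
assert v_values == serial
```

Threads are used here only to keep the example self-contained; CPython’s GIL means real CPU-bound speedups require processes or MPI, but the point-wise independence being exploited is identical.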

This paper proves that regular economists at any university can get access to substantial supercomputing time. Many of you will be skeptical about this claim because I am a Stanford employee in the heart of Silicon Valley. Stanford may have access to high-power computers, but those resources are controlled by individual University units. The Hoover Institution does not (nor should it) have a supercomputer for research.

Yongyang was supported by an NSF grant administered by Argonne National Labs and the University of Chicago. At one time, Yongyang and I thought we would have access to computers at Argonne and/or Chicago, but the grant’s co-PIs denied our request for access to any UC or Argonne computer.

I emphasize these facts for two reasons: to show that beginning in about 2006 we did not need special privileges nor major institutional support to get access to massively parallel computer systems, and to recognize Yongyang for his hard work and determination in getting the job done with little help.

Lessons for all economists

I have long advocated the use of modern computing hardware and algorithms in economics. Our JPE paper is just one example of what economists can do on their own. This blog will discuss other such demonstrations. I also hope that the computational science community will see that economists can be worthy research partners.
