MAIN THEME
As the title of the article indicates, the author argues that macroeconomics has gone backwards over the last three decades. Nowadays, economists attribute fluctuations in aggregate variables to imaginary forces rather than to the actions of human beings. He addresses two questions:
- How do macroeconomists disregard the problem of identification?
- Why have macroeconomists, and economists as a whole, forgotten their duty?
In the first five parts he tries to answer the first question, and in the remaining parts he addresses the second. Let us begin with the first question.
The author explains, with the help of a historical example, how monetary factors matter for aggregate economic activity. The Volcker deflation episode in American history is a very good illustration that monetary policy matters. The Federal Reserve has direct control over the monetary base and can change it by buying and selling securities. When one bank borrows reserves from another, it pays the nominal federal funds rate; if the Fed makes reserves scarce, this rate goes up. The best indicator of monetary policy is the real federal funds rate: the nominal rate minus the inflation rate. When Paul Volcker (an American economist, chairman of the Fed under presidents Jimmy Carter and Ronald Reagan, and widely credited with ending the high US inflation of the 1970s and early 1980s) took office as chairman of the Fed, he announced a prompt increase in the federal funds rate in order to dampen the inflationary forces in the economy. This caused real interest rates to rise, output to fall, and unemployment to increase. The rate of inflation fell, either because the combination of higher unemployment and a bigger output gap caused it to fall, or because the Fed's actions changed expectations.
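To make the indicator concrete, here is a minimal sketch of the real-rate arithmetic; the numbers are rough, illustrative values in the spirit of the 1981 peak, not official series:

```python
# Minimal sketch: the real federal funds rate is the nominal federal funds
# rate minus the inflation rate. Hypothetical round numbers for illustration.
nominal_rate = 0.19   # nominal federal funds rate (roughly 19% at the 1981 peak)
inflation = 0.10      # contemporaneous inflation rate (roughly 10%)

real_rate = nominal_rate - inflation
print(f"real federal funds rate: {real_rate:.1%}")  # -> 9.0%
```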
Now the only way to remain faithful to the dogma that monetary policy is not important is to argue that the Fed did not change the federal funds rate; rather, an imaginary shock increased it at just the right time and by just the right amount to fool people at the Fed into thinking that they were the ones moving it around.
Next we see how, with the launch of the Real Business Cycle (RBC) model, macroeconomists got comfortable with the idea that fluctuations in macroeconomic aggregates are caused by imaginary shocks rather than by actions that people take. RBC theory makes the fundamental assumption that an economy moves through the phases of the business cycle because of technology shocks, not monetary shocks or changes in expectations. The RBC model relies on two identities; the first is given as
Δ%A = Δ%Y – Δ%X
This defines Δ%A as the difference between the growth of output Y and the growth of an index X of inputs to production. So what is this term Δ%A? Abramovitz (1956) famously referred to this residual as the "measure of our ignorance". The author mocks it by calling it "phlogiston".
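A minimal numerical sketch of the identity; the growth figures are made up purely for illustration:

```python
# The growth-accounting identity behind the RBC model: the residual Δ%A is
# whatever output growth is left over after subtracting the growth of an
# index X of inputs. Hypothetical numbers.
output_growth = 0.03   # Δ%Y: output grew 3%
input_growth = 0.02    # Δ%X: the input index grew 2%

residual = output_growth - input_growth  # Δ%A, the "measure of our ignorance"
print(f"Δ%A (technology residual / 'phlogiston'): {residual:.1%}")  # -> 1.0%
```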
The second identity defining the RBC model is the quantity theory of money:
V = YP/M
According to the quantity theory of money, given output Y (and velocity V), the only effect of a change in the monetary aggregate M is a proportional change in the price level P. Thus, in this model, monetary policy has no real effects. The proponents of the RBC model cite its microeconomic foundations as one of its main advantages, yet they offer no microeconomic evidence for the negative phlogiston shocks (Δ%A) that the model needs.
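A small sketch makes the neutrality point concrete; the values of V and Y are arbitrary placeholders, held fixed by assumption as in the RBC core:

```python
# Quantity theory identity V = PY/M. Holding output Y and velocity V fixed,
# a change in M must show up one-for-one in P, so money is neutral for real
# activity. All values are hypothetical.
V = 2.0    # velocity, fixed by assumption
Y = 100.0  # real output, fixed in the RBC core

for M in (50.0, 100.0):   # double the money supply
    P = M * V / Y         # rearranged identity: P = MV/Y
    print(f"M = {M:6.1f} -> P = {P:.2f}")
# Doubling M from 50 to 100 doubles P from 1.00 to 2.00; Y is untouched.
```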
Extensions of the RBC core in the form of DSGE models produce a further mess: their proponents attribute different fluctuations to different types of imaginary forces, which the author labels "trolls", "gremlins", "aether", "caloric", and so on.
To allow for the possibility that monetary policy could matter, DSGE modelers assume that prices are sticky, which lets monetary policy affect output. But even then the results do not stray far from the RBC dogma: if monetary policy matters at all, it matters very little.
THE IDENTIFICATION PROBLEM
The identification problem means that to get results, an econometrician has to feed in something other than the data on the variables in the simultaneous system. The author refers to the things that get fed in as facts with unknown truth value (FWUTVs). The current practice in DSGE econometrics is to feed in FWUTVs by "calibrating" the values of some parameters or by feeding in tight Bayesian priors. When the Smets and Wouters (SW) model is applied to US data for years that include the Volcker deflation, the results conclude:
… monetary policy shocks contribute only a small fraction of the forecast variance of output at all horizons.
… monetary policy shocks account for only a small fraction of the inflation volatility.
So what matters in the model is not money but the imaginary forces. A modelling strategy that allows imaginary shocks and hence more variables makes the identification problem worse.
The author explains the identification problem further using the simplest supply-and-demand model of the labor market. Suppose we want to know the effect of a policy change on the supply and demand for labor; for that, we need the elasticity of labor demand. If we ask the software to estimate the underlying demand and supply curves using only the data, it returns an error: the system is not identified. Feed in a FWUTV by imposing the restriction that the supply curve is vertical, and the software returns results. Feed in a different FWUTV by imposing the restriction that the supply curve passes through the origin, and again it returns results. Which of these FWUTVs is true? At least one of them has to be false, but the estimates tell us nothing about which, so each set of estimates is meaningless.
More generally, in a simultaneous system of m equations, each equation can contain all m variables, so there are roughly m² parameters to estimate from only m equations, and FWUTVs have to be fed in. Lucas and Sargent insisted that expectations, modeled as rational, had to be part of any such system. But with a term for expectations added to each equation, the number of parameters doubles to roughly 2m² while the number of equations stays the same. So allowing for the possibility that expectations influence behavior makes the identification problem at least twice as bad.
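The following simulation sketches this point under assumed, made-up supply and demand parameters (it is not Romer's exact specification): each of the two identifying restrictions below returns an estimate, the estimates disagree, and nothing in the data says which FWUTV to believe.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical "true" simultaneous system (log scale):
#   demand: q = 1.0 - 0.5 * p + u_d
#   supply: q = 0.2 + 0.8 * p + u_s
u_d = rng.normal(0.0, 0.3, n)
u_s = rng.normal(0.0, 0.3, n)
p = (1.0 - 0.2 + u_d - u_s) / (0.8 + 0.5)   # equilibrium price
q = 1.0 - 0.5 * p + u_d                     # equilibrium quantity

# FWUTV 1: assume all shocks shift supply, so an OLS fit of q on p
# supposedly traces out the demand curve.
slope_1 = np.polyfit(p, q, 1)[0]

# FWUTV 2: assume the supply curve is vertical, so quantity is exogenous
# and the demand slope is recovered by regressing p on q and inverting.
slope_2 = 1.0 / np.polyfit(q, p, 1)[0]

print(f"demand slope under FWUTV 1: {slope_1:+.2f}")
print(f"demand slope under FWUTV 2: {slope_2:+.2f}")
# Both runs return an answer, the answers disagree, and neither matches
# the true slope of -0.5; the data alone cannot say which FWUTV is false.
```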
REGRESS IN THE TREATMENT OF IDENTIFICATION
According to the author, the problem of identification is no longer given careful attention; macroeconomists still rely on feeding in FWUTVs. Some of the ways now used to achieve identification are:
NATURAL EXPERIMENT
To estimate the elasticity of labor demand, the method of Friedman and Schwartz would be to look for two adjacent periods in history that were similar in every other way except for a change that shifts the labor supply curve in one period relative to the other. If such a pair could be found, they would ignore all the other data points and base the estimate on just that pair. The Friedman and Schwartz approach feeds in a fact whose truth value others can assess, and thus opens the results up to criticism and revision.
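In the labor-market example, the logic of such a natural experiment can be sketched as follows, assuming (this is the fact being fed in, and others can assess it) that only supply moved between the two periods; all numbers are hypothetical:

```python
# True (hypothetical) demand curve: q = 1.0 - 0.5 * p. Suppose we find two
# adjacent periods identical except for a shift in labor supply.
p_before, q_before = 0.60, 1.0 - 0.5 * 0.60   # equilibrium before the shift
p_after,  q_after  = 0.40, 1.0 - 0.5 * 0.40   # equilibrium after the shift

# If only supply moved, both equilibria lie on the same demand curve, so
# the slope between the two points identifies the demand elasticity.
demand_slope = (q_after - q_before) / (p_after - p_before)
print(f"estimated demand slope: {demand_slope:+.2f}")  # -> -0.50
```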
IDENTIFICATION BY ASSUMPTION
To get around the problem of estimating roughly m² parameters from m equations, FWUTVs are fed in for many of the parameters, mainly by setting them equal to zero. But no evidence is offered by which the truth value of these FWUTVs could be assessed.
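The parameter counting behind this can be sketched directly; m = 7 below is an arbitrary illustrative choice:

```python
# In a simultaneous system with m variables, each of the m equations can in
# principle include all m variables, so there are about m*m coefficients to
# estimate; adding an expectation term for each variable doubles the count.
m = 7                                   # arbitrary illustrative system size
params = m * m                          # 49 parameters, but only 7 equations
params_with_expectations = 2 * m * m    # 98 once expectations enter too

# Identification by assumption feeds in FWUTVs by zeroing most coefficients,
# e.g. allowing only 3 nonzero coefficients per equation:
free_params_after_zeros = 3 * m         # 21
print(params, params_with_expectations, free_params_after_zeros)
```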
IDENTIFICATION BY DEDUCTION
Mathematical deduction is another way Lucas and Sargent (1979) suggested for solving the identification problem. But math cannot establish the truth value of a fact; it never has and it never will. With enough math, an author can be confident that most readers will never figure out where a FWUTV is buried. A discussant or referee cannot say that an identification assumption is not credible if they cannot figure out what it is and are too embarrassed to ask.
IDENTIFICATION BY OBFUSCATION
The author replicated the results of the Smets and Wouters (2007) model using the software package Dynare, whose User's Guide says that the inclusion of priors helps identify parameters. From this the author infers that, besides identification by deduction and by calibration, setting tight priors is a third way to achieve identification in DSGE models. Onatski and Williams (2010) show that if you feed different priors into an earlier version of the Smets and Wouters model, you get back different structural estimates. In other words, it is not the data but the prior that is informative about the estimates: even with an infinite sample of data available, any inference about the demand elasticity comes exclusively from the prior distribution (Baumeister and Hamilton, 2015).
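The mechanics can be sketched with a toy unidentified model (an illustration of the Baumeister and Hamilton point, not the Smets and Wouters model itself): the data pin down only the sum a + b, so the split between a and b comes entirely from the prior, however large the sample.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy unidentified model: the data are y ~ N(a + b, 1), so only the sum
# a + b is identified. The hypothetical true sum is 1.0.
y = rng.normal(1.0, 1.0, 1_000_000)   # an effectively infinite sample
s_hat = y.mean()                      # the data pin down a + b and no more

# With independent normal priors a ~ N(mu_a, t^2) and b ~ N(mu_b, t^2),
# the posterior mean of a given the (essentially exact) sum estimate is
# mu_a + 0.5 * (s_hat - mu_a - mu_b): the prior, not the data, decides
# how the identified sum is split between a and b.
for mu_a, mu_b in [(0.0, 1.0), (1.0, 0.0)]:
    post_mean_a = mu_a + 0.5 * (s_hat - mu_a - mu_b)
    print(f"prior means ({mu_a}, {mu_b}) -> posterior mean of a: {post_mean_a:+.2f}")
# -> about +0.00 under the first prior and +1.00 under the second.
```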
Now let us turn to the question the author addresses in the last sections of the paper: why have economists as a whole forgotten their duty?
One example of a meta-question is why macroeconomists started invoking imaginary driving forces to explain fluctuations. Another is why they seem to have forgotten what had been learned about the identification problem. The author points to a list of characteristics, drawn from Smolin (2007), that he argues cause these two things to happen. These characteristics are:
- Tremendous self-confidence
- An unusually monolithic community, with a strong sense of consensus
- A sense of identification with the group akin to identification with a religious faith
- A lack of appreciation for the risk inherent in a research program
- A strong sense of the boundary between the group and other experts
- A disregard for the ideas, opinions, and work of experts who are not part of the group
Admiration for a certain group of leaders evolves into deference to them. Deference leads to effort along the specific lines that the leaders recommend. It is true that guidance from authority can align the efforts of many researchers, but then conformity to the facts is no longer needed as a coordinating device. Eventually evidence stops being relevant, and progress in the field comes to be judged by the purity of its mathematical theories, as determined by the authorities.
CORROSION OF THE NORMS OF SCIENCE
It is not true that self-interest is a threat to science. People are always motivated by self-interest. Science is a social system that uses competition to direct the self-interest of the individual to the advantage of the group. The problem is that competition in science, like competition in the market, is vulnerable to corrosion.
Bob Lucas, Ed Prescott, and Tom Sargent led the development of post-real macroeconomics. Before 1980, they made important scientific contributions to macroeconomic theory. Their shared experience developed a bond of loyalty that would be admirable and productive in many social contexts, but in science this loyalty introduced bias.
One example of corrosive loyalty: in his 1995 Nobel lecture, Lucas discussed the importance of monetary policy for macroeconomic theory, yet in his 2003 presidential address to the American Economic Association he gave a strong endorsement of Prescott's claim that monetary economics was a triviality. The only explanation the author can find for the strong claims Lucas makes in the 2003 lecture, relative to what he wrote before and after it, is that he was doing his best to support his friend Prescott.
A second example of arguments that go above and beyond the call of science is the defense Tom Sargent offered for Lucas's (1980) paper on the quantity theory of money. Lucas estimated a demand for nominal money and found it to be proportional to the price level, as the quantity theory predicts; he used a specific way of filtering the data to get this result. Sargent and Surico (2011) revisit Lucas's approach and show that when it is applied to data from after the Volcker deflation, the method yields a very different result. Yet they defend Lucas's original finding by saying that the change could have arisen from a change in the money supply process, and they reinterpret in this light Lucas's remark that there are conditions in which the quantity theory might break down.
The simplest way to describe their result is to say that using Lucas’s estimator, the exponent on the price level in the demand for money is identified only under restrictive assumptions about the money supply.
BACK TO SQUARE ONE
The author concludes that macroeconomics is back to square one, since macro models now make assumptions that are no more credible and far more opaque than before. When the person who says something that seems wrong is a revered leader of a group with the characteristics listed above, there is a price to pay for open disagreement. This situation spells real trouble ahead for all of economics. The trouble is not so much that macroeconomists say things that are inconsistent with the facts; the real trouble is that other economists do not care that macroeconomists do not care about the facts. An indifferent tolerance of obvious error is even more corrosive to science than committed advocacy of error.