I recently read, with great interest, the Resolution Foundation’s “Recession ready? Assessing the UK’s macroeconomic framework”. Written by James Smith and three co-authors, it promised the broad, solution-led survey of how monetary, fiscal, and structural policy might be reformed that I was in the mood for. But it shouldn’t take 113 pages to say “when monetary policy becomes ineffective, use fiscal policy instead”. I know that I’m being a little harsh to suggest that’s all they do, but I can’t help feeling that the Keynesian conclusion can be readily anticipated from the Keynesian premise, which is “what tools do we need to escape a future recession?” The alternative approach would be to ask “what macroeconomic framework will prevent recessions from occurring?” - a deeper question, and one which may well lead to different answers.
I totally accept that low interest rates have caused a major problem for the Bank of England, but only under the prevailing approach of inflation targeting. An attack on the status quo isn’t an effective criticism of monetary policy more generally. The report pays lip service to a higher inflation target and to more systematic QE (something I approve of), but seems keen to dismiss these as minor tweaks to an approach that didn’t adequately draw attention to fiscal policy. But demonstrating that different fiscal policy improves on current monetary policy is not a fair fight.
(As an indication of the pernickety way in which I read the report, Figure 3 reveals that “Recessions always result in falling GDP and rising unemployment”, but falling GDP is the definition of a recession. Also, Figure 32 claims to show that “tax receipts have fallen since the financial crisis”, but given that it shows receipts as a proportion of GDP, it could just as well be labelled “GDP has risen”.)
To consider how to fix monetary policy, such that we’re not left with fiscal policy as a last resort, I recommend “Facts, Fears and Functionality of NGDP Level Targeting: A Guide to a Popular Framework for Monetary Policy”, written by David Beckworth and published by the Mercatus Center. This is the paper I’d been hoping he’d write for some time, covering the basic idea behind a nominal GDP target, as well as dealing with some common criticisms.
The first criticism he deals with is how to respond to changes in potential real GDP growth (Y*). If we set an NGDP target of 4% (let’s say we want ~2% inflation and believe the long-term growth rate of real GDP is 2%), then what happens if a productivity slowdown reduces Y* to 1%? Beckworth points out that there are two options. The first is to recalibrate the NGDP target, e.g. by bringing it down to 3%. But he also mentions George Selgin’s preference, which is to allow the trend inflation rate to change instead. I see three main reasons to favour the latter: (i) by permitting inflation to rise to 3% you are utilising the price system’s ability to signal information, rather than using it as an arbitrary means to shape expectations; (ii) a little inflation volatility is tolerable given that by adopting an NGDP target you’ve already decided that nominal income stability is more important than price stability; and (iii) in this situation you’d need to adjust an inflation target anyway.
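The arithmetic behind the two options is just the growth-accounting identity, NGDP growth ≈ inflation + real growth. A minimal sketch, using only the numbers from the paragraph above:

```python
# Identity: NGDP growth ≈ trend inflation + potential real GDP growth (Y*).
# All figures are the illustrative percentages from the text, nothing more.

def implied_inflation(ngdp_target: float, real_growth: float) -> float:
    """Trend inflation implied by an NGDP growth target and potential real growth."""
    return ngdp_target - real_growth

# Starting point: a 4% NGDP target built from 2% inflation + 2% real growth.
assert implied_inflation(4.0, 2.0) == 2.0

# Productivity slowdown: Y* falls from 2% to 1%.
# Option 1 (recalibrate): cut the NGDP target to 3%, preserving 2% inflation.
print(implied_inflation(3.0, 1.0))  # 2.0

# Option 2 (Selgin): keep the 4% target and let trend inflation drift up.
print(implied_inflation(4.0, 1.0))  # 3.0
```

The point of the comparison is that option 2 leaves the target untouched and lets the inflation rate carry the news about productivity, which is the information-signalling argument in (i).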
The second criticism is that data revisions make NGDP targets impractical. I think this is a valid concern - CPI data is more immediate and reliable than GDP figures - and I don’t find Beckworth’s reassurances totally convincing. His first counterpoint is that a Taylor rule also requires an estimate of GDP (and potential GDP to boot), but this is precisely why many policymakers avoid using it. Certainly in a UK context I believe MPC members place much more weight on the latest CPI figures than on their impression of the output gap. Beckworth’s second counterpoint is that there are ways to improve our real-time estimates. He suggests using income data, high-frequency nowcasting (such as that used by the Atlanta Fed), or credit card companies’ payment systems. I’m intrigued by this, and looked into payment data in my book on alternative monetary indicators (see chart below). But whenever I read people getting excited by the possibilities of big data my Austrian instincts kick in, and I question whether a lack of data, or computing power, has ever been the binding constraint on central planning. And even if big data would help, it isn’t ready yet. And I want an NGDP target now! His third counterpoint is my favoured approach, which is using markets rather than computing power - in particular Scott Sumner’s proposal for an NGDP futures market. But it’s never been clear to me how such a futures market would deal with revisions to past GDP figures. (It seems to open the door to policy changes in response to revisions to past data, but this may just be a downside of any level target.)
The third criticism that Beckworth responds to is that the public won’t understand NGDP targets, and I like Beckworth’s framing of “dollar income growth” as an improvement over a “cost of living index”. But this does cut against the political difficulty of having to wind back an overshoot. In the same way that it’s difficult to sell a need for higher prices to the public during a downturn, it would not be popular to put the brakes on wage growth when the economy is strong.
Finally, Beckworth makes a strong case for the financial stability benefits of NGDP targets. As he says,
the countercyclical inflation created by an NGDP level target will cause real debt burdens to change in a procyclical manner
This is a big improvement on the status quo. Currently, a recession means that prices fall (or inflation slows) while nominal debts stay fixed, so the real debt burden on debtors rises. Under an NGDP level target, a downturn would see inflation rise to keep nominal incomes on track, lowering the real debt burden and sharing some of those losses with the creditor. Beckworth points to empirical literature showing this risk sharing is useful in a world of incomplete financial markets.
Two bold papers, both worth reading.