

Tuesday 29 October 2013

Politicians are (not) all the same

Chris Giles wrote about a month ago that “Britain will not have much of a choice at the 2015 election. However much they talk about clear differences, the parties have rarely been closer on economics”. He will probably hate me for saying this, but I was reminded of his article when I watched this interview between Jeremy Paxman and Russell Brand. The common theme, which Chris Dillow also picks up, is that the current political system offers no real choice.

This theme, common on the left, has a long pedigree. I remember being told to stop being exercised by hanging chads in 2000 because a Gore presidency would be very much like a Bush presidency. This idea is clearly ludicrous if we look at US politics today. But does it apply to the UK? I’m afraid I’m going to be very unforgiving. It either represents naivety or indulgence.

Chris Giles has grounds for his view, in that on issues like austerity Labour are trying hard not to appear very different from the current coalition. On the other hand, if you are struggling to ‘pay the bedroom tax’, Labour’s commitment to abolish it could make a big difference to your life. He may also be right that the actual content of Ed Miliband’s conference proposals on energy and housing is modest, and hardly the return to full-blooded socialism that some on the right hysterically proclaim. The government’s ‘Help to Buy’ scheme is much more likely to involve a prolonged period of government intervention in a market. But I think if you were to conclude from this that a Labour government after 2015 would have a similar economic policy to a Conservative government, you would be very naive.

Consider two big dividing lines between left and right on economic policy: the size of the state and the distribution of income. On the first, there are strong arguments that the current government’s austerity programme is not so much about the perils of high debt but a deliberate attempt to roll back the size of the state. Is it really likely that Labour would continue that policy if it was elected? It is much more likely that we would see a repeat of what happened under the last Labour government: an initial period of sticking to inherited plans to demonstrate prudence, and then a programme of real growth in areas like health and education. On the second, as I outlined here, the current government’s policies will lead to a significant increase in poverty over the next decade. When it was last in power, Labour tried very hard to achieve the opposite (although I agree it was much more concerned about poverty than inequality). Is it really likely that Labour will behave quite differently, and much more like the Conservatives, if they regain power?

Now you could argue that the financial situation of the government will remain so dire after 2015 that any government will be forced to keep cutting spending and welfare. Maybe. However I think it is more likely that the economic recovery will turn out to be much stronger than currently forecast, and that the OBR will revise up their estimate of potential output as this happens. This will create 'fiscal space'. If this occurs under the Conservatives, I would put my money on significant tax cuts, while under Labour we will see many of today’s cuts in spending and welfare reversed.

Another way of making the same point is that it is naive to believe politicians when they set out their political programmes. In a two party system within the framework of a simple left/right scheme, it may be optimal as an opposition to position yourself just to the side of your opponent, as long as this does not alienate your core vote. Once you regain power you can revert to type. (Remember Cameron’s compassionate conservatism before the last election.) The problem with that dynamic is that it may lead to the appearance that ‘all politicians are the same’ as we move towards an election, which may discourage some ‘rationally naive’ potential voters (those who are not too interested in politics) from voting. (It may also generate such a negative view of politicians that it leads otherwise sane people into rather silly positions.)

It is clear that Russell Brand is not uninterested in politics, so he should not be so naive. He seems pretty passionate about issues like equality and climate change, so it seems blindingly obvious to me who he should vote for. So why does he appear to encourage others not to vote? The argument of the true revolutionary is that anything that makes the current system more palatable just delays the revolution’s eventual triumph. But that need not be what is going on here. Instead it could be a reluctance to be associated, however mildly, with a political party that is far from your political ideal (even though it is not quite as far from your ideal as the others). The number of times I have heard someone say: ‘Even though I hate party B, I couldn’t possibly vote for A because of their position on X’. But as I have argued above, the gap between parties A and B (and C) can make a significant difference when one gains power. So to refuse to vote for A because it makes you feel somehow complicit in the aspects of A’s platform you do not like seems to me just personal indulgence.

This is not to dispute that many like Brand or Dillow feel that we require much more radical change than is offered by mainstream politics. They should continue to use the media to promote that view when they can. But for people like them, working out which political party is the least bad is fairly costless. Using this knowledge to vote, and making this knowledge public, does not compromise their more radical views, and it could help make a significant difference to many people’s lives.


Monday 28 October 2013

The ‘official’ cost of austerity

Well, not quite, but probably as close as we will ever get.[1] In a new paper, Jan in‘t Veld uses the European Commission’s QUEST model to estimate the impact of fiscal consolidation in the Eurozone (EZ) from 2011 to 2013. The numbers in the table below include spillover effects from other EZ country fiscal consolidations, so they are best interpreted as the impact of overall EZ fiscal consolidation over this period. There are at least two important things to note about the exercise. First, they do not attempt to analyse the impact of the particular mix between cuts in spending and increases in taxes applied in each country. Instead the ‘input’ is simply the change in the general government primary structural balance each year, which is assumed to be equally balanced between expenditure and revenue measures. (More on this below.) Second, to a first approximation this fiscal consolidation is assumed to lead to no change in short or long term real interest rates during the 2011-13 period.

GDP losses due to Eurozone fiscal consolidation (including spillovers), 2011-13. Source: European Economy Economic Papers 506, Table 5.

             Impact on GDP 2013    Cumulative impact 2011-13
Germany      3.9%                  8.1%
France       4.8%                  9.1%
Spain        5.4%                  9.7%
Ireland      4.5%                  8.4%
Greece       8.1%                  18.0%

Of course many would argue that had countries like Spain or Greece not undertaken this degree of austerity, long term interest rates might have been even higher than they actually were. (Short rates might also have been higher if, with stronger growth, the ECB had raised them; but remember that tax increases also helped raise EZ inflation.) However a significant amount of fiscal consolidation took place in Germany, and this had sizeable spillover effects on other EZ countries. It is difficult to see why that consolidation was required to ease funding pressures.

One slightly surprising aspect of the exercise has already been noted. To quote: “As detailed information about the composition of the actual consolidations is not available, it is assumed the composition is equally balanced between expenditure and revenue measures.” As other institutions like the IMF publish exactly that kind of information, I’m puzzled. What the paper does report is that QUEST shows that consolidation implemented through spending cuts has about twice the short run multiplier as consolidation through higher taxes, but of course this is exactly what theory would suggest. In a forward looking model like this it also matters a great deal how agents perceive the permanence of these policy changes.
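To get a feel for how much the composition assumption matters, here is a minimal back-of-the-envelope sketch in Python. The multipliers are purely illustrative assumptions of mine, chosen only to respect the roughly two-to-one ratio just mentioned; they are not numbers taken from QUEST.

```python
# Illustrative multipliers only - my assumptions, not QUEST output.
# Spending cuts get roughly twice the short-run multiplier of tax
# rises, in line with what the paper (and theory) suggests.
M_SPEND, M_TAX = 1.0, 0.5

def first_year_gdp_loss(consolidation_pct_gdp, spend_share):
    """GDP loss (% of GDP) from a consolidation of a given size,
    for a given expenditure share of the package."""
    blended = spend_share * M_SPEND + (1 - spend_share) * M_TAX
    return consolidation_pct_gdp * blended

print(first_year_gdp_loss(2.0, 0.5))  # 50/50 split: 1.5% of GDP
print(first_year_gdp_loss(2.0, 1.0))  # all spending cuts: 2.0% of GDP
```

Moving from a 50/50 split to an all-spending consolidation raises the estimated output cost by a third in this toy example, which is why the missing composition data is more than a detail.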

Of course QUEST is just one DSGE model, which just happens to be maintained by the Commission. An earlier study (pdf) by Holland and Portes at NIESR had important differences in detail, but the bottom line was similar: EZ GDP was 4% lower in 2013, and the cumulated GDP loss was 8.6%. These numbers are of course large, and so it is quite reasonable to say that the proximate cause of the second EZ recession is simply austerity.

Now many would argue that much of this was forced by the 2010 crisis. There seems to be a mood of fatalism among many in Europe that this was all largely unavoidable. I think that is quite wrong. Some fiscal tightening in Greece was inevitable, but if EZ policy makers had taken a much more realistic view about how much debt had to be written off, we could have avoided the current disaster. What ended the EZ crisis was not austerity but OMT: if that had been rolled out in 2010 rather than 2012, other periphery countries could also have adjusted more gradually. And of course fiscal consolidation in Germany and some other core countries was not required at all. If instead we had seen fiscal expansion there, to counter the problem of hitting the ZLB, then the overall impact of fiscal policy on EZ GDP need not have been negative. (Section 5 of Jan in‘t Veld’s paper looks at the impact of such a stimulus.) That means that over 3 years nearly 10% of Eurozone GDP has been needlessly lost through mistakes in policy. This is not the wild claim of a mad macroeconomist, but what simple analysis backed up by mainstream models tells us.

One final point. The UK equivalents of these ‘official’ numbers are the OBR’s estimates of the impact of fiscal consolidation on the UK. While they are significant in size, they are smaller than these EZ numbers. The OBR estimate that UK GDP in 2013 is about 1.5% lower as a result of fiscal consolidation, and the cumulated GDP loss due to fiscal tightening from 2010 to 2013 is a bit above 5%. There is a good reason and a bad reason for this difference. First, the UK is a more open economy than the EZ as a whole, and we would expect openness to cushion the impact of fiscal consolidation. Second, the OBR’s numbers are more crudely derived, based on multipliers that take no account of the zero lower bound or deep recession. The numbers from the QUEST model allow for both, which of course raises the size of the fiscal impact.


[1] As the disclaimer says, “The views expressed are the author’s alone and do not necessarily correspond to those of the European Commission.” A good example of the official line is Buti and Carnot, but this line does not tend to be backed up by model simulations, which I think is revealing. 



Saturday 26 October 2013

Rational expectations, the media and politics

As those of you who have read a few of my posts will know, on the occasion that I venture into political science I like to push the idea that the attitudes and organisation of the media are an important part of trying to understand the political dynamic today. (See for example here and here, but also here.) To put it simply, the media help cause changes in public opinion, rather than simply reflect that opinion. Yet, if you have a certain caricature of what a modern macroeconomist believes in your head, this is a strange argument for one to make. That caricature is that we all believe in rational expectations, where agents use all readily available information in an efficient way to make decisions. If that was true when people came to form political opinions (on issues like immigration, or crime, for example), then information provided by media organisations on these issues would be irrelevant. In the age of the internet, it is fairly easy to get the true facts.

Some who read my posts will also know that I am a fan of rational expectations. I tend to get irritated with those (e.g. some heterodox economists) who pan the idea by talking about superhuman agents that know everything. To engage constructively with how to model expectations, you have to talk about practical alternatives. If we want something simple (and, in particular, if we do not want to complicate matters by borrowing from the extensive recent literature on learning), we often seem to have to choose between assuming rationality or something naive, like adaptive expectations. I have argued that, for the kind of macroeconomic issues that I am interested in, rational expectations provides a more realistic starting point, although that should never stop us analysing the consequences of expectations errors.

So why do I take a different view when it comes to the role of the media in politics? The answer simply relates to the costs and benefits of obtaining information. If you are trying to think about how consumers will react to a tax cut, or how agents in the FOREX market make decisions, you are talking about issues where expectation errors will be costly to the individual agents involved. So there are benefits to trying to gather information to avoid those mistakes. Compare this to political issues, like whether the government should be taking action over climate change. What are the costs of getting this wrong for the individual? Almost negligible: they may cast their vote in the wrong way. Now for society as a whole the costs are huge, but that is not the relevant thought experiment when thinking about individual decisions about whether to be better informed about climate change. Most people will reason that the costs of being better informed are quite high relative to the expected benefit, because the impact of their vote on the actual outcome of an election is negligible. [1]
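The implicit calculation here is the standard one from the economics of voting. As a sketch with made-up numbers:

\[ \text{expected benefit of being informed} \approx p \times B \]

where \( p \) is the probability that your vote changes the outcome and \( B \) is the value to you of the better outcome. With \( p \) around one in ten million and \( B \) as large as £10,000, the expected benefit is about a tenth of a penny: far below the cost of even an hour spent gathering information.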

Which is why, as Paul Krugman often reminds us, most people do not spend much time (on the internet or elsewhere) gathering information about issues like climate change, crime or immigration. That is a rational decision! They do, however, engage with media for other reasons, and are therefore likely to pick up information from there at little cost. So if the media distorts information, it matters.

That is my a priori conjecture, but what about evidence? Take opinions about climate change in the US. As this study (pdf) shows, a distressingly large proportion (45%) of those polled thought that there is “a lot of disagreement among scientists about whether or not global warming is happening”, whereas in fact there is near unanimity among scientists. Now you could, I suppose, argue that this misperception had nothing to do with Fox News or talk radio, but just reflected the fact that people wanted to believe otherwise. But that seems unlikely, as you could more easily believe that although climate change was happening, the costs of doing anything about it outweighed the benefits. Certainly those institutions dedicated to climate change denial think beliefs about the science are important.

Here in the UK is a survey that Ipsos MORI conducted for the Royal Statistical Society and King’s College London (HT Tim Harford). The survey highlights the misperceptions they found, and in some cases errors were huge. To give two examples, the public think that £24 out of every £100 spent on benefits is claimed fraudulently, compared with official estimates of £0.70 per £100, and people think that 31% of the population are immigrants, when the official figure is 13%. In contrast, estimates of the number of people who regularly read a newspaper, or had a Facebook account (where people probably had to draw on their own experience rather than stories in the media), were much more accurate.

These surveys certainly suggest that people’s views on at least some key issues are based on perceptions that can be wildly inaccurate. The UK survey also suggests there is an understandable tendency to overestimate things that are ‘in the news’: the level of unemployment was overestimated (pdf) by a factor of 2 or 3, the number of UK Muslims by a factor of 4 or 5, whereas the estimated proportion of those living in poverty was pretty close to the true figure. But it is also striking that the really wild misperceptions were on issues that tend to receive disproportionate tabloid coverage: apart from the benefit fraud example quoted above, we have

“people are most likely to think that capping benefits at £26,000 per household will save most money from a list provided (33% pick this option), over twice the level that select raising the pension age to 66 for both men and women or stopping child benefit when someone in the household earns £50k+.  In fact, capping household benefits is estimated to save £290m, compared with £5bn for raising the pension age and £1.7bn for stopping child benefit for wealthier households.”

One final point. Some of the comments on my recent post on this issue said, in effect, how typical of those on the left [2] to think that people who hold views they don’t like must have been brainwashed. But of course there are plenty on the right (almost certainly more than on the left) who spend a lot of their time complaining about media bias the other way. The refrain about liberal bias in the US media is ubiquitous, and in the UK it is mainly right wing think tanks and politicians who go on about BBC bias. And if you think that is because the BBC is biased (towards Labour, Europe etc), then unfortunately the facts suggest otherwise, as Mike Berry outlines here. In fact, if you are looking for people who honestly believe the media is not that important politically, I suspect you will find more of them on the left than the right. But wherever they come from, I think they are mistaken.


[1] Of course elections are fought over many issues, which just reinforces this point. People are also increasingly likely to be apathetic about the political process, often because ‘all political parties seem the same’. I want to talk about this view in a subsequent post.


[2] I should note that on this blog I have never said how I vote, or advised others to vote in any way. I try to either focus on the macroeconomics (and criticise politicians only when they get this wrong), or to focus on understanding political trends when I stray beyond economics. I have no problem with others doing political advocacy, as long as they are honest about it, but it is not my comparative advantage so I try and avoid it. I have of course been highly critical of the current coalition’s macro policy, but if it was a Labour government undertaking austerity (as it might have been) I would be just as critical of them. If you think I’m to the left because (a) I think policy should be evidence based, (b) I do not like the fact that current government policy is knowingly raising UK poverty, or (c) I think climate change is a critical problem, then all I would say is that either you are being unfair to the political right, or that this says something really worrying about where the right is just now.

Friday 25 October 2013

In defence of forward guidance

Although this post is prompted by the bad press that the Bank of England’s forward guidance has been getting recently, much of what I have to say also applies to the US, where the policy is very similar. But there is one criticism of the policy in both countries that I agree with, which I will save until the end.

A good deal of the criticism seems to stem from a potential ambiguity about what the policy is designed to do. The policy could simply be seen as an attempt to make monetary policy more transparent, and I think that is the best way to think about it in both countries. However the policy could also be seen as a commitment to raise future inflation above target in an attempt to overcome the ZLB constraint, as suggested by Michael Woodford in particular [1]. Let’s call this the Woodfordian policy for short. (John Cochrane makes a similar distinction here.) Ironically the reason why it helps with transparency is also the reason it could be confused with the Woodfordian policy.

In an earlier post written before the Bank of England unveiled its version of forward guidance, I presented evidence that might lead those outside the Bank to think that it was just targeting 2% inflation two years out. We could describe that as the Bank being an inflation forecast nutter, because it gave no weight to the output gap 2 years out. An alternative policy is the conventional textbook one, where the Bank targets both inflation and the output gap in all periods. I suggested that if the Bank published forward guidance, this could clearly establish which policy it was following. It has and it did: we now know it is not just targeting 2% inflation 2 years out, because it says it will not raise rates if forecast inflation is expected to be below 2.5% and unemployment remains above 7%. 2.5% is not hugely different from 2%, but in the world of monetary policy much ‘ink’ is spilt over even smaller things.
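To see what the guidance does and does not pin down, here is a deliberately stylised sketch of the threshold rule in Python. It is my own simplification of the published guidance, not the MPC’s actual reaction function (it ignores, for example, the financial stability knockout).

```python
# A stylised sketch of the Bank of England's forward guidance
# thresholds. My simplification: the financial stability knockout,
# among other things, is omitted.
def rate_rise_on_the_table(forecast_inflation, unemployment):
    """Guidance: no rate rise while forecast inflation (around two
    years out) is below 2.5% and unemployment remains above 7%."""
    return forecast_inflation >= 2.5 or unemployment <= 7.0

print(rate_rise_on_the_table(2.2, 7.7))  # False: the guidance binds
print(rate_rise_on_the_table(2.6, 7.7))  # True: inflation knockout
```

The 2.5% threshold is what makes the point concrete: if the Bank were simply targeting 2% inflation two years out, the first call would also allow a rate rise.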

So forward guidance has made things clearer, as long as you do not think monetary policy is trying to implement a Woodfordian policy. Unfortunately if you really believed a central bank was an inflation forecast nutter, then you could see forward guidance in Woodford terms. Now I think there are good arguments against this interpretation. First, 2.5% is an incredibly modest Woodfordian policy. Second, for the US, there is a dual mandate, which would seem to be inconsistent with being an inflation forecast nutter. Third, for the UK, the MPC has made it pretty clear (most recently here) that it is not pursuing a Woodfordian policy. For all these reasons, I think it is best to see forward guidance as increasing transparency.

For those who just want to know whether monetary policy changes represent stimulus or contraction, this can be confusing. We saw this in the UK with the questioning of Mark Carney by the Treasury Select Committee, and Tony Yates has pursued a similar theme. Mark Carney kept saying that the stance of monetary policy is unchanged, but forward guidance makes monetary policy more ‘effective’. Now you could spend pages trying to glean insights into disagreements among the MPC from all this, and I do not want to claim that the Bank is always as clear as it might be here. However it seems to me that if the Bank wants for some reason to call reducing uncertainty ‘increasing effectiveness’, then there is nothing wrong with what Carney is saying. Forward guidance helps agents in the economy understand how monetary policy will react if something unexpected happens. In particular, growth could be stronger than expected (UK third quarter output has subsequently increased by 0.8%), but the decline in unemployment could remain slow (it fell from 7.8% to 7.7% over the last three months). Charlie Bean makes a similar case here. [2]

In much of the UK media forward guidance has been labelled a failure because longer term interest rates went up as forward guidance was rolled out. Now if you (incorrectly) see forward guidance as a Woodfordian policy, you might indeed be disappointed that long rates went up (although you would still want to abstract from other influences on rates at that time). However if it is about clarifying monetary policy in general, no particular movement in long rates is intended.

Ironically, some of the apparent critics of forward guidance in the UK, like Chris Giles here, also think the Bank of England could be much more transparent in various ways. The most comprehensive list is given by Tony Yates, and I agree with much of what he says. But we all know that central banks are very conservative beasts, and do things rather gradually. So improvements in transparency are going to be incremental and slow. When they do happen, in this case through forward guidance, they should be welcomed rather than panned. Criticising innovation by central banks risks fuelling their natural conservatism.

This suggests that the major weakness with forward guidance is that it does not go far enough. In the current context, as events in the US have shown, the main problem is that it applies only to interest rates and not to unconventional monetary policy. This allowed the market to get very confused about the Fed’s future intentions on bond buying. So why not welcome forward guidance by saying: can we have more, please?


[1] Gauti B. Eggertsson & Michael Woodford, 2003. “The Zero Bound on Interest Rates and Optimal Monetary Policy,” Brookings Papers on Economic Activity, Economic Studies Program, The Brookings Institution, vol. 34(1), pages 139-235. See also Krugman, Paul. 1998. “It’s Baaack! Japan’s Slump and the Return of the Liquidity Trap.” BPEA, 2:1998, 137–87.
[2] If you wanted to be pedantic, you could argue that to the extent that some thought growth might be faster than expected, and that this would lead to higher rates even if unemployment remained high, then dispelling this particular possibility must mean that averaged across all states of the world policy has become more stimulatory. Carney might not want to acknowledge that because some on the MPC would get upset. My reaction to this would be, why do you want to be pedantic. 


Wednesday 23 October 2013

What is wrong with the USA?

A lot of US blog posts have asked this after the US government came very close to self-inflicted default. It was indeed an extraordinary episode which indicates that something is very wrong. All I want to suggest here is that it may help to put this discussion in a global context. What has happened in the US has of course many elements which can only be fully understood in the domestic context and given US history, like the enduring influence of race, or culture wars. But with other, more economic, elements it may be more accurate to describe the US as leading the way, with other countries following.

Jared Bernstein writes “The US economy has left large swaths of people behind.  History shows that such periods are ripe for demagogues, and here again, deep pockets buy not only the policy set that protects them, but the “think tanks,” research results, and media presence that foments the polarization that insulates them further.” Support for the right in the US does appear to be correlated with low incomes and low human capital. Yet while growing inequality may be most noticeable in the US, it is not unique to it, as the chart below from the Paris School of Economics database shows. Stagnation of median wages may have been evident for longer in the US, but the recession has led to declining real wages in many other countries. Partly as a result, we have seen ‘farther right’ parties gaining popularity across Europe in recent years.



Yet surely, you might say, what is unique to the US is that a large section of the political right has got ‘out of control’, such that it has done significant harm to the economy and almost did much more. If, following Jurek Martin in the FT, we describe business interests as ‘big money’, then it appears as if the Republican party has been acting against big money. Here there may be a parallel with the UK which could be instructive.

In the UK, David Cameron has been forced to concede a referendum on continued UK membership of the European Union, in an attempt to stem the popularity of the UK Independence Party. Much of UK business would regard leaving the EU as disastrous, so Cameron will almost certainly recommend staying in the EU. But with a fiercely anti-EU press, and a divided party, he could well lose a referendum. So the referendum pledge seems like a forced concession to the farther right that entails considerable risks. Chris Dillow notes other areas where a right wing government appears to be acting against ‘big money’.

While hostility to immigration has always been a reaction to economic decline, it is difficult to argue that hostility to the European Union is a burning issue for the majority of people in the UK. So why was Cameron forced to make such a dangerous concession over the referendum? One important factor is that the EU is a very important issue for all of the right wing press, which is universally antagonistic in its reporting. Murdoch’s hostility is well documented, which suggests the press is leading rather than reflecting popular mistrust. So while a right wing press is generally useful to the Conservative Party, in this particular case it seems to be pushing it in a direction ’big money’ does not want to go.

Most discussion of the Tea Party seems to view Fox News and talk radio as simply a mirror to a phenomenon that must be explained. However in the case of the EU and the UK press it seems causality runs the other way: it is the press that helps fire up the passion of a minority and the attitudes of a wider majority. When I see lists of influential people within the Republican Party, names like Rush Limbaugh, Matt Drudge and Glenn Beck seem to figure prominently. Perhaps in both the US and UK, ‘big money’ and those on the centre right need to ask themselves whether - in enabling and encouraging a highly partisan and emotive media - they have helped create something they can no longer control.

Postscript 25th October

I would not normally bother with responses to my post like this, but Mr. Bourne has a line which might for a fleeting moment sound convincing. He says that my post insults eurosceptics like him, because by suggesting that the influence of the press is strong on this issue, I must be assuming that he, other eurosceptics and perhaps all the British people are stupid. (It also shows I despise democracy and the nation state, apparently, but let’s ignore that.) This struck me as odd, because I can be pretty eurosceptic at times, so I must be insulting myself!

This is nonsense, because it confuses intelligence with information. People’s views are influenced by the information they receive (mine certainly are), and therefore it is important that this is not one sided. My concern with much of the UK press is that on many issues people are getting very distorted information. Now Mr. Bourne thinks that in worrying about that, I am implying that people are stupid. But wait a minute. Mr. Bourne’s employer, the CPS, recently released a report criticising the BBC for bias. If they think that is important, and they seem to, does that not mean they also think people are stupid?


I suspect Mr Bourne chose to be insulted so he could have a good old eurosceptic rant. Which is fine, but please do not make completely unjustified assertions about what my views are at the same time. That is insulting. 


  

Monday 21 October 2013

Is a currency crisis bad for you at the ZLB?

My and Paul Krugman’s comments on Ken Rogoff’s FT piece have generated an interesting discussion, including another piece from Ken Rogoff, a further response from Paul Krugman, and posts from Brad DeLong, Ryan Avent in the Economist, Matthew Klein, John McHale (follow-up here), Nick Rowe and Tony Yates. The discussion centred on the UK, but it is more general than that: a similar thought experiment is if China sells its US government debt. Paul Krugman has promised more, so this may be the equivalent of one of those annoying people in a seminar audience who try and predict with questions what the speaker will say.  

The track I want to pursue starts from my argument that Quantitative Easing (QE) will ensure that the UK government never runs out of money with which to pay the bills. The obvious response, which Paul Krugman’s comment took up, is that a market reaction against UK government debt might be accompanied by a reaction against the UK’s currency, leading to a ‘sterling crisis’. For those with a long memory of UK macro history, would this be 1976 all over again?

Ken Rogoff’s original article, and his reply, both focus on the trigger for these events being a collapse in the Euro. While such risks should be taken seriously (although, for the record, I have never thought such a break up was the likely outcome), I want to ignore this possibility here, simply because it introduces too many complicating issues. Instead I just want to look at a much more limited scenario. Specifically, suppose Labour had not lost the election, and austerity had been delayed (by more than Ken Rogoff would have thought wise, bearing in mind that he agrees that government ‘investment’ in its widest sense was in fact cut too aggressively). Suppose further that markets had decided that because of this alone it was no longer wise to buy UK debt. QE fills the gap, but the flight from UK debt is also a flight from sterling, which depreciates as a result. If the market believes the government is not solvent, it will assume some part of QE is permanent rather than temporary, so inflation expectations rise. This validates the fall in sterling, in the sense that the market does not see any capital gains to be made from its depreciation.

In this thought experiment the UK government is solvent before the crisis, so other things being equal QE will be temporary. We are not talking about a strategy of inflating away the debt. There are two directions we could follow at this point. One would be the idea that a depreciation makes the UK government insolvent: in other words the crisis is self-fulfilling. The analogy is with a funding crisis under a fixed exchange rate, where by pushing up the interest rate on debt, markets can induce the insolvency they fear. (The government was not insolvent under pre-crisis interest rates, but is insolvent if it has to borrow at post-crisis rates.) But it’s not clear how this would work when exchange rates are flexible. The depreciation would do nothing to raise current or future borrowing costs, or reduce the long term tax base. At some stage markets would realise their mistake. I cannot see why there are multiple equilibria here, but maybe that is a failure of my imagination.
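To see why the analogy does not carry over, consider the standard debt arithmetic. As a sketch, keeping the debt to GDP ratio \( b \) stable requires a primary surplus \( s \) satisfying

\[ s \ge (r - g)\, b \]

where \( r \) is the interest rate on government debt and \( g \) the growth rate. Under a fixed exchange rate (or within a currency union), markets that push up \( r \) raise the required surplus, so the insolvency they fear can create itself. A pure depreciation, by contrast, changes neither \( r \) nor the long term tax base, which is why it is hard to see where the second equilibrium comes from.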

The second direction is to focus on the short run costs of the nominal depreciation. Some of those who commented on my original post talked about a ‘downward spiral’ in sterling. Now of course the depreciation itself may raise inflation to some extent, but this is not an unstable process. At some point sterling falls to a level at which the market thinks it is OK to hold. There is no bottomless pit.

As domestic prices are sticky, the depreciation increases UK competitiveness, which increases the demand for UK goods. If the Zero Lower Bound (ZLB) is a constraint, then this increase in demand reduces or eliminates the constraint. It may also raise inflation because demand is higher, but higher demand is what monetary policy wanted but was unable to achieve. If the depreciation is so large that demand has to be reduced through increasing interest rates, that is fine: this is what monetary policy is for. In essence the ZLB is a welfare reducing constraint that the crisis relieves, and we are all better off.

If that sounds too good to be true, I think it could be. It treats the depreciation as simply a positive demand shock, one that first eliminates the cost of the ZLB, and can then be offset by monetary policy. However a sterling crisis may also involve a deterioration in the output/inflation trade-off: equivalent to what macroeconomists call a cost-push shock. Take a basic Phillips curve. If agents start to believe inflation will be higher - even if these beliefs are incorrect (the crisis, and QE, is temporary) - then while these mistaken expectations last there is a cost to the economy, because the central bank will have to raise interest rates to prevent these expectations fully feeding through into actual inflation. Either output will be lower, or inflation higher, or both.
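In textbook notation (a sketch of the standard specification, not a claim about any particular model the Bank uses), write the Phillips curve as

\[ \pi_t = E^a_t \pi_{t+1} + \kappa y_t \]

where \( y_t \) is the output gap and \( E^a \) denotes agents’ (possibly mistaken) expectations. If \( E^a_t \pi_{t+1} \) exceeds the inflation that policy will actually deliver, the excess acts exactly like a cost-push term \( u_t \) in the usual form \( \pi_t = E_t \pi_{t+1} + \kappa y_t + u_t \): for any given output gap inflation is higher, so the central bank faces a worse menu of output and inflation outcomes.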

Now as long as interest rates stay at zero, this is not a problem: the ZLB constraint still dominates. [1] However if the crisis was big enough, we could overshoot Brad DeLong’s sweet spot, and end up living with a cost-push shock that was more costly than the ZLB constraint. Personally I think this is stretching non-linearities rather too much. First you have to believe that austerity, which reduces debt a bit more quickly than otherwise, is just enough to prevent a crisis, and then that the crisis is so big that it more than offsets the ZLB constraint. Unlikely, but it seems to be a coherent possibility.

But what about 1976, when pressure on sterling led the UK government to seek help from the IMF? Of course that was different, because we were not at the ZLB. I also suspect there were other differences. I was a very junior economist working in the Treasury at the time. I do not remember very much, but what I do remember was that there seemed to be a mindset among senior policy makers that sterling was on a kind of slippery slope. Once it started falling, who knows where it might end up? They seemed to believe there was a bottomless pit, and the IMF loan was required to stop us going there. I suspect that was in part just a lack of familiarity with how flexible exchange rates work. In the end, sterling did stabilise such that some of the IMF loan was not needed, and I wonder how necessary it really was. But if anyone who reads this post knows more about this period, I would be very interested in their thoughts.

[1] A possibility suggested by John McHale is that, although the interest rate set by the Bank of England could stay at its lower bound, there might be an increase in the interest rate spread, such that UK firms or consumers end up paying more. I’m not clear in my own mind why a depreciation would raise this spread. Even if the market believes the UK government is no longer solvent, the central bank still has the ability to prevent a run on UK banks. If the spread did increase, then programmes like Funding For Lending would still be possible.



Saturday 19 October 2013

The decline of evidence-based policy

There is a brief description of Cartwright and Hardie’s book ‘Evidence-based policy: a practical guide to doing it better’ on Amazon. It starts: “Over the last twenty or so years, it has become standard to require policy makers to base their recommendations on evidence. That is now uncontroversial to the point of triviality--of course, policy should be based on the facts.”

My immediate reaction is to say ‘unless that policy involves the macroeconomics of fiscal policy’, but once you see one area where evidence is ignored, you begin to see many more. Here are just two that I have read about over the last few weeks, courtesy of the UK government. They can better be described as ‘emotion-based policy’, or ‘election-based policy’, and I fear they may now be the rule rather than the exception.

The first involves what is generally known as the ‘bedroom tax’. The policy reduces the housing benefit payable to tenants deemed to be under-occupying their homes. This policy has undoubtedly caused considerable hardship to many of those affected, but it also saves public money. So crucial in any assessment of the desirability of the policy is how much money it saves. Getting such an estimate is complex, because it will depend on a lot of factors, like whether individuals move in response to losing benefit, where they move to, and so on. The UK’s Department for Work and Pensions has a model that calculated savings of £480m in 2013/14, an estimate it published in June 2012.

So the first thing to do in trying to assess the realism of the £480m figure is to look at the model. We cannot do that, as it is not published. However, Rebecca Tunstall of the Centre for Housing Policy at the University of York has managed to obtain some spreadsheets, using Freedom of Information requests. In a report, she looks at some of the assumptions behind the department’s calculations. Many look very questionable, and as Alex Marsh observes here, there seems to be a pattern - the questionable assumptions tend to overestimate the savings involved. The calculations also ignore some of the other financial costs and consequences that an overall assessment of the policy should take into account. Alex Marsh suspects that allowing for these, and taking into account the evidence we are now getting about how people are responding, the savings created by the policy could disappear completely.

If policy was evidence-based, then either the government would be disputing in detail Professor Tunstall’s analysis, or rethinking the policy. Neither is likely to happen for one simple reason. The policy was never evidence based. It was instead inspired by the tabloid-led attack on welfare recipients, and the idea that it was unjust that ‘hard working taxpayers’ should fund a spare room for benefit recipients. Here for example is Stephen Glover writing in the Daily Mail: “The notion that many families not on welfare don’t have the luxury of a spare room, and may have to have one or two people in every bedroom, is foreign to the head-in-the-clouds types that proliferate at the BBC.” Now that feeling of unfairness is real enough, but it is not based on evidence either about what impact the policy might have, or whether it will actually save any money.

The second example is the recent ‘clampdown’ announced by the UK Home Secretary on ‘health tourism’. Among the measures included in the government's new immigration bill are a £200 charge on all temporary migrants for using the NHS and a requirement for GPs to check the migration status of new patients. Both policies are aimed at preventing migrants travelling to the UK to seek free healthcare. Conservatives have been particularly keen to attack ‘health tourists’ from other EU countries, thereby hitting two election-sensitive issues among potential UKIP voters: migration and the EU.

So what evidence does the UK government have on the scale of this EU health tourism? The answer is that it has none. A classified document had the following very telling phrase: "we consider that these questions place too much emphasis on quantitative evidence". So considerable extra costs are going to be imposed on GPs and other parts of the NHS to help deal with a problem which may be trivial.

However the European Commission has succeeded in getting some evidence on ‘benefit tourism’, which it recently published. As Jonathan Portes reports, approximately 4% of those claiming unemployment benefit (jobseeker’s allowance) in the UK are EU migrants, although they represent well over 5% of those in work. More generally, as the Commission says: “Mobile EU citizens are less likely to receive disability and unemployment benefits in most countries studied.” For much the same reason, the proportional demands made by EU migrants on the UK health service are likely to be less than the native population. They may also choose to avoid the NHS, for a variety of reasons.

Which you might think is a bit embarrassing for a government that is about to create a whole raft of additional bureaucratic costs to deal with this ‘problem’. Well not embarrassing if you read the Daily Telegraph or Daily Mail. Their reading of the Commission report was that it showed “more than 600,000 unemployed European Union migrants are living in Britain at a cost of £1.5 billion to the NHS alone”. Only one slight problem - the true number is 38,000. So how did the Daily Telegraph manage to inflate the true number by a factor of 15? Because they ‘confused’ unemployed with non-employed, where the latter include students (of which there are many), retired people, carers and others with family responsibilities. As Richard Exell says: “Our government may be chumps when it comes to evidence-based policy making, but they can always rely on world-class distortion to see them through.” The report remains on the Telegraph’s website, uncorrected.


I’ve included this last paragraph to make a very simple point. Some commentators tend to argue that we make too much of what happens in the media (see Chris Dillow here for example). I think this is plain wrong. Information is vital. People make judgements based on the information they receive. As one student said to me this week, they knew that the Obama stimulus package had had little impact on the US economy, because they had read this in the FT and the Economist. (For better analysis, follow the links from here.) If people have distorted information (or if those bringing a bit of reality are vilified), politicians have no incentive to base policy on actual evidence. The decline in evidence-based policy, and the declining importance of facts for much of the media, are not unrelated.  


Postscript, 20th October

If a reputable newspaper makes a mistake, it acknowledges that mistake. Another kind of newspaper, it seems, can try shouting down those who pointed out the mistake. That was what the Daily Mail did over its attempt to slur Ralph Miliband as the ‘man who hated Britain’. So today we find the Telegraph doing the same over its reporting that “more than 600,000 unemployed European Union migrants are living in Britain”. Rather than admit that it got that wrong, it instead tries to pretend that it is in fact battling some kind of great conspiracy emanating from Brussels.

The lead article on its website today (19 October) by its chief reporter Robert Mendick could not be a better illustration of the points I made in the last paragraph of my post. The article’s first paragraph says
“The [Commission] study — whose details were first disclosed in The Telegraph — showed that more than 600,000 “non-active” EU migrants were living in the UK at a possible cost to the NHS alone of £1.5 billion a year.”
Here is the opening paragraph by the same reporter in the original (12 October) piece:
“More than 600,000 unemployed European Union migrants are living in Britain at a cost of £1.5 billion to the NHS alone, according to an EU report.”
Virtually identical – except for the little details of replacing unemployed with non-active (the writer puts non-active in quotes, apparently unable to translate this into ordinary language, like students, housewives, retired people), and the insertion of ‘possible’ before cost. No acknowledgement of these changes, but instead an attempt to manufacture another story, the ‘developments’ of which are

(a) some politicians did not like the conclusion of the Commission report. (No, really?)

(b) quotes from an academic at Oxford who says that “There is no problem with the numbers [in the report]. The issue is the interpretation of those numbers.” (Heady stuff!)

(c) the paper has found out that the “independent consultancies who wrote the report were awarded EU contracts worth more than £70 million over six years.” (Ah, the plot thickens.)

(d) a BBC report which attempted to reflect the facts had been accused of being unbalanced by a Conservative minister. (My god, this is serious.)

Then we have: “Evidence of mounting public concern in the EU’s biggest economies over migration emerged in a poll yesterday which showed that the introduction of restrictions on EU migrants’ rights is backed by 83 per cent of Britons.” I wonder where people are getting their information from!


And then there are attempts to discredit opponents. The European commissioner in charge of the department that published the study is quoted in a paragraph that begins: “Mr Andor, a socialist, said: ...”. Later on we have: “It also emerged that one of the main supporters of the report ... was in receipt of more than £600,000 of EU funding for the year ending March 2012 for his think tank.” Yes, that is Jonathan Portes, director of the National Institute, which has been doing - shock, horror - research funded by the Commission. As Portes points out in the article, it has also been doing research funded by the UK government, but clearly the Commission has managed to exert its evil influence on Mr Portes where the UK government has failed. How can that be? The article helpfully informs us that he was a “former senior economics adviser to the last Labour government”. He was actually a civil servant working under both governments (he has an interesting account of his role advising Norman Lamont in 1992 here), but I guess that is another one of those little details, like the difference between ‘unemployed’ and ‘non-active’.

Thursday 17 October 2013

How not to run fiscal policy: more lessons from the Eurozone

In my last but one post I noted the case of European Commission estimates of the output gap as an example of what can happen if you do not allow for the asymmetry implied by firms’ reluctance to cut nominal wages. Their methodology implied that the natural rate in Spain had more than doubled in a decade, which seems nonsensical. However the economists at the Commission at least came to recognise the problem, and had proposed making some changes to get more reasonable numbers. As I will explain, the methodology they use is reasonable given the state of the macroeconomic art, which is why in that earlier post I used this as an example of a failure of macroeconomics generally, rather than of the economists at the Commission. In this post I want to note what happened next, which in contrast does seem to reflect badly on how the Commission works.

But before getting to that, a few background points. First why this matters. As part of the “two pack” (don’t ask), Eurozone economies will now have to submit their budgets for approval by the Commission. Approval will depend, among other things, on the Commission’s calculations of the structural budget deficit, which is the deficit corrected for the cycle. Measuring this is difficult, because we do not observe the output gap, which itself depends on both the natural rate of unemployment and the underlying trend in productivity (technical progress), which we also do not observe.
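The standard cyclical adjustment convention (a sketch of the idea; the Commission’s actual procedure has more moving parts) is

\[ b^{struct}_t = b_t - \varepsilon \, \hat{y}_t \]

where \( b_t \) is the actual balance as a share of GDP, \( \hat{y}_t \) the output gap as a share of potential output, and \( \varepsilon \) the semi-elasticity of the budget with respect to the cycle. Overestimate the natural rate of unemployment and you understate how negative \( \hat{y}_t \) is, which mechanically inflates the measured structural deficit.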

You can say at this point why bother - just stick to looking at the actual deficit. That’s an overreaction. For most Eurozone countries we are pretty clear about the sign of the output gap, so making some adjustment should be better than doing nothing. To see the kind of stupidity that arises from just focusing on actual deficits, see the Netherlands.

In principle we can use information on what we do observe, like wage inflation, to make inferences about what the natural rate of unemployment is. So if wage inflation depends on the gap between actual unemployment and the natural rate (called the NAWRU by the Commission), we can switch things around to make this an equation telling us what the natural rate is, given observations on actual wage inflation.
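In its simplest form (my stylisation, not the Commission’s exact specification), suppose wage inflation follows

\[ \Delta w_t = E_t \pi_t - \beta\,(u_t - u^*_t) + e_t \]

where \( u^*_t \) is the NAWRU and \( e_t \) an error term. Rearranging gives

\[ u^*_t = u_t + \frac{\Delta w_t - E_t \pi_t - e_t}{\beta} \]

so every judgement about the specification, the expectations term and the treatment of the errors feeds directly into the estimated natural rate.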

There are three kinds of problem that arise in doing this. The first is that our estimates will only be as good as the specification of the wage equation. If we leave important factors out of the specification of the wage equation (like a reluctance to cut the nominal wages of existing workers because of morale effects - see this paper by Eliaz and Spiegler for example), we will get our estimates of the natural rate wrong. The second is that some of the things that we are sure do determine wage inflation, like inflation expectations, may be difficult to measure. Paul Krugman discusses the possibility that inflation expectations in Spain might have become anchored here. Finally no equation is perfect, and if we do not allow for these inevitable errors we will get a ridiculously bumpy series for the natural rate. So we need to apply some kind of smoothing.

The way the Commission tackled these problems is described in detail here. The fact that they use a Kalman filter to deal with smoothing issues seems sensible to me: I have been quite fond of the Kalman filter ever since I wrote a paper with Andrew Harvey, Brian Henry and Simon Peters that used it to estimate labour productivity back in 1986. [1] I suspect they are getting the implausible results for one of the other two reasons. But the key thing to take away is that there are no easy answers here, and this kind of problem requires quite specific macroeconomic expertise. Furthermore, the importance of particular issues may vary between countries, so country specific expertise should be helpful.
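For the curious, here is a minimal sketch in Python of the smoothing step. It assumes the wage equation has already been inverted to give noisy period-by-period readings of the natural rate, and it fits the simplest possible state space model (a random walk observed with noise); the Commission’s actual setup is considerably richer.

```python
import numpy as np

def kalman_local_level(y, q=0.01, r=1.0):
    """Filtered estimates of a random-walk state observed with noise.
    q is the state innovation variance, r the observation noise
    variance; the ratio q/r controls how smooth the path is."""
    x_est = np.zeros(len(y))
    x, p = y[0], 1e4          # diffuse-ish initial state and variance
    for t, obs in enumerate(y):
        p_pred = p + q        # predict: a random walk carries the mean over
        k = p_pred / (p_pred + r)   # Kalman gain
        x = x + k * (obs - x)       # update towards the observation
        p = (1 - k) * p_pred
        x_est[t] = x
    return x_est

# Illustrative data: a slowly drifting 'true' natural rate, observed
# with a lot of noise via the inverted wage equation.
rng = np.random.default_rng(0)
true_nawru = 8 + np.cumsum(rng.normal(0, 0.05, 40))
noisy_readings = true_nawru + rng.normal(0, 1.0, 40)
smoothed_nawru = kalman_local_level(noisy_readings)
```

The ratio q/r is where the judgement sits: set it too high and the estimated ‘natural rate’ simply tracks every movement in the noisy readings, and hence in actual unemployment, which is exactly the route to concluding that Spain’s natural rate doubled in a decade.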

Following on from this last point, it is clear that this is not just an issue for Spain. As the Irish Fiscal Advisory Council notes here (p68), the Commission estimates for the natural rate in Ireland also look implausibly high. So it is not surprising that the Commission would want to adjust their methodology to give more plausible numbers. (Matthew Dalton looks at the implications of their methodology for the US here.)

This all matters. If the Commission underestimates the output gap because it overestimates the natural rate of unemployment, then it will overestimate the structural budget deficit, and the country concerned will come under considerable pressure to undertake further austerity. Readers will know why I think that will be a very costly outcome. (More on this from Cohen-Setton and Valla here.)

At least economists at the Commission have now recognised the problem, and changed their estimates. But as Matthew Dalton reports

“The change was approved by technical experts at a meeting last week and was expected to be supported at a Tuesday meeting of more senior officials in Brussels. But an article published in The Wall Street Journal about last week's decision generated concern in some national capitals about its effects on budget policies, an EU official said. The new methodology will be sent back to the expert committee for further discussions, in an effort to understand what its impact will be on all 28 EU countries, the official said.”

Reassuringly, Dalton adds that "The commission is fully on board with the new methodology," the official said. "We believe it is superior." But he also notes that the new methodology “was supposed to have been used by the commission in its next round of estimates of the structural deficit, to be published in November. Now that will have to wait, if it is approved at all.” As one of my commentators pointed out, this article in a leading German newspaper may have contributed to this official hesitation.

So the Commission will go on making estimates that it knows are overestimating structural budget deficits, because of ‘concern in some capitals’ about the implications of using better estimates! None of this does the Commission any good in terms of its competence to help determine national fiscal policy. (Of course nothing can top the incompetence recently shown by the US congress, but that is no excuse.) 

Luckily there is an obviously better way to proceed, even within the confines of the deeply flawed Fiscal Compact. Many Eurozone countries already have their own ‘fiscal councils’: independent bodies set up to provide scrutiny of national fiscal policy. It should be central to the mission of these bodies to estimate the output gap and structural deficit, as it is impossible to look at fiscal sustainability without doing so. So why not get estimates of the output gap from these institutions, which will be able to take into account country-specific factors and use the academic expertise that exists in those countries to maximum effect? (Spain and Ireland are hardly short of good macroeconomists.) Unlike the governments of those countries, fiscal councils should not be prone to bias in producing these estimates. The Commission can play a coordinating role, getting experts from the national fiscal councils together to share ideas and expertise. This seems to me a clearly better way to proceed, unless of course your goal is to maximise the influence of the Commission.



[1] Harvey, A., Henry, S.G.B., Peters, S. and Wren-Lewis, S. (1986), Stochastic trends in dynamic regression models: an application to the employment output relationship, Economic Journal, vol. 96, pp. 975-985.

Tuesday 15 October 2013

Microfoundations and Macro Wars

There have been two strands of reaction to my last post. One has been to interpret it as yet another salvo in the macro wars. The second has been to deny there is an issue here: to quote Tony Yates: “The pragmatic microfounders and empirical macro people have won out entirely”. If people are confused, perhaps some remarks by way of clarification might be helpful.

There are potentially three different debates going on here. The first is the familiar Keynesian/anti-Keynesian debate. The second is whether ‘proper’ policy analysis has to be done with microfounded models, or whether there is also an important role for more eclectic (and data-based) aggregate models in policy analysis, like IS-LM. The third is about how far microfoundation modellers should be allowed to go in incorporating non-microfounded (or maybe behavioural) relationships in their models.

Although all three debates are important in their own right, in this post I want to explore the extent to which they are linked. But I want to say at the outset what, in my view, is not up for debate among mainstream macroeconomists: microfounded macromodels are likely to remain the mainstay of academic macro analysis for the foreseeable future. Many macroeconomists outside the mainstream, and some other economists, might wish it otherwise, but I think they are wrong to do so. DSGE models really do tell us a lot of interesting and important things.

For those who are not economists, let’s be clear what the microfoundations project in macro is all about. The idea is that a macro model should be built up from a formal analysis of the behaviour of individual agents in a consistent way. There may be just a single representative agent or, increasingly, many heterogeneous agents. So a typical journal paper in macro nowadays will involve lots of optimisation by individual agents as a way of deriving aggregate relationships.
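To make this concrete for non-economists, here is the textbook example (nothing specific to the papers discussed in this post): a representative household chooses a consumption path to maximise expected discounted utility subject to its budget constraint,

```latex
\max_{\{c_t\}} \; E_0 \sum_{t=0}^{\infty} \beta^t u(c_t)
\quad \text{s.t.} \quad a_{t+1} = (1+r_t)\,a_t + y_t - c_t ,
```

which delivers as its first-order condition the consumption Euler equation

```latex
u'(c_t) = \beta \, E_t\!\left[ (1+r_{t+1})\, u'(c_{t+1}) \right].
```

The aggregate consumption relationship is then whatever this optimisation implies, rather than an equation written down because it happens to fit the data.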

Compare this to two alternative ways of ‘doing macro’. The first goes to the other extreme: choose a bunch of macro variables, and just look at the historical relationships between them (a VAR). This uses minimal theory, and the focus is all about the past empirical interaction between macro aggregates. The second would sit in between the two. It might start off with aggregate macro relationships, and justify them with some eclectic mix of theory and empirics. You can think of IS-LM as an example of this third way. In reality there is probably a spectrum of alternatives here, with different mixes between theoretical consistency and consistency with the data (see this post).
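For readers unfamiliar with the first approach: a VAR really is as atheoretical as it sounds. Each variable is regressed on lags of itself and of the others, and that is the whole model. A minimal sketch in Python using simulated data and the statsmodels library (the variable names are mine and purely illustrative):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(42)
T = 200

# Simulated stand-ins for two macro series (purely illustrative).
data = pd.DataFrame({
    "output_growth": rng.normal(size=T),
    "inflation": rng.normal(size=T),
})

# Each variable is regressed on two lags of itself and the other:
# no theory enters beyond the choice of variables and lag length.
results = VAR(data).fit(2)
print(results.summary())

# The fitted VAR can then be used for forecasting or impulse responses.
forecast = results.forecast(data.values[-results.k_ar:], steps=4)
irf = results.irf(10)  # impulse responses over 10 periods
```

Everything the model ‘knows’ comes from the past empirical interaction between the chosen variables.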

In the 1960s and 1970s, a good deal of macro analysis in journals was of this third type. The trouble with this approach, as New Classical economists demonstrated, was that the theoretical rationale behind equations often turned out to be inadequate and inconsistent. The Lucas critique is the most widely quoted example of where this happens. So the microfoundations project said: let’s do the theory properly and rigorously, so we do not make these kinds of errors. In fact, let’s make theoretical (‘internal’) consistency the overriding aim, such that anything which fails on these grounds is rejected. There were two practical costs to this approach. First, doing this was hard, so for a time many real world complexities had to be set aside (like the importance of banks in rationing credit, for example, or the reluctance of firms to cut nominal wages). This led to a second cost, which was that less notice was taken of how well each aggregate macro relationship tracked the data (‘external’ consistency). To use a jargon phrase that sums it up quite well: internal rather than external consistency became the test of admissibility for these models.
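It is worth spelling out the Lucas critique, since it did so much to discredit the third approach. In the stylised textbook version (an illustration of my own choosing, not tied to any particular study), output depends only on inflation surprises:

```latex
y_t = \alpha \left( \pi_t - E_{t-1}\pi_t \right) + \varepsilon_t .
```

If past policy made inflation partly predictable, a regression of output on inflation will appear to uncover a stable, exploitable tradeoff. But the estimated coefficient mixes \(\alpha\) with the old policy rule: change the rule to exploit the apparent tradeoff, and expectations adjust, so the fitted equation breaks down. Equations justified by fit alone are not invariant to policy.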

The microfoundations project was extremely successful, such that it became generally accepted among most academics that all policy analysis should be done with microfounded models. However I think macroeconomists are divided about how strict to be about microfoundations: this is the distinction between purists and pragmatists that I made here. Should every part of a model be microfounded, or are we allowed a bit of discretion occasionally? Plenty of ‘pragmatic’ papers exist, so just referencing a few tells us very little. Tony Yates thinks the pragmatists have won, and I think David Andolfatto in a comment on my post agrees. I would like to think they are right, but my own experience talking to other macroeconomists suggests they are not.

But let’s just explore what it might mean if they were right. Macroeconomists would be quite happy incorporating non-microfounded elements into their models, when strong empirical evidence appeared to warrant this. Referees would not be concerned. But there is no logical reason to include only one non-microfounded element at a time: why not allow more than one aggregate equation to be data- rather than theory-based? In that case, ‘very pragmatic’ microfoundation models could begin to look like the aggregate models of the past, which used a combination of theory and empirical evidence to justify particular equations.
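To make the idea concrete, here is a toy hybrid of the kind I have in mind, with made-up parameters that are entirely my own: one equation keeps a calibrated, theory-based form, while the other is a purely data-based rule of the sort a pragmatist might estimate and bolt on.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200

# Theory-based ingredient: an accelerationist Phillips curve with a
# calibrated slope kappa (standing in for a microfounded parameter).
kappa = 0.1

# Data-based ingredient: the output gap follows an estimated AR(1);
# rho_hat stands in for a regression coefficient, not a deep parameter.
rho_hat = 0.8

y = np.zeros(T)   # output gap
pi = np.zeros(T)  # inflation
for t in range(1, T):
    y[t] = rho_hat * y[t - 1] + rng.normal(scale=0.5)
    pi[t] = pi[t - 1] + kappa * y[t] + rng.normal(scale=0.2)

# Nothing stops us adding a second or third data-based equation, at which
# point the model shades into the eclectic aggregate models of the past.
```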

I would have no problem with this, as I have argued that these more eclectic aggregate models have an important role to play alongside more traditional DSGE models in policy analysis, particularly in policy making institutions that require flexible and robust tools. Paul Krugman is fond of suggesting that IS-LM type models are more useful than microfounded models, with the latter being a check on the former, so I guess he wouldn’t worry about this either. But others do seem to want to argue that IS-LM type models should have no place in ‘proper’ policy analysis, at least in the pages of academic journals. If you take this view but want to be a microfoundations pragmatist, just where do you draw the line on pragmatism?

I have deliberately avoided mentioning the K word so far. This is because I think it is possible to imagine a world where Keynesian economics had not been invented, but where debates over microfoundations would still take place. For example, Heathcote et al. talk about ‘modelling what you can microfound’ versus ‘modelling what you can see’ in relation to the incompleteness of asset markets, and I think this is a very similar purist/pragmatist microfoundations debate, even though there is no direct connection to sticky prices.

However in the real world where, thankfully, Keynesian economics does exist, I think it becomes problematic to be both a New Keynesian and a microfoundations purist. First, there is Paul Krugman’s basic point. Before New Keynesian theory, New Classical economists argued that because sticky wages and prices were not microfounded, they should not be in our models. (Some who are unconvinced by New Keynesian ideas still make that case.) Were they right at the time? I think a microfoundations purist would have to say yes, which is problematic because it seems an absurd position for a Keynesian to take. Second, in this paper I argued that the microfoundations project, in embracing sticky prices, had to make an important methodological compromise, one which a microfoundations purist should worry about. I think Chari, Kehoe and McGrattan are making similar kinds of points. Yet my own paper arose out of talking to New Keynesian economists who appeared to take a purist position, which is why I wrote it.
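To see what kind of compromise is involved, consider the workhorse New Keynesian Phillips curve, usually derived from Calvo’s device of letting each firm reset its price only with some fixed probability each period:

```latex
\pi_t = \beta \, E_t \pi_{t+1} + \kappa \, x_t ,
```

where \(x_t\) is the output gap and the slope \(\kappa\) is a function of the deep parameters, including the Calvo probability. On one common reading, the compromise is the Calvo device itself: the frequency of price adjustment is imposed for tractability rather than derived from any optimisation problem, which is exactly the sort of thing a strict purist should object to.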

It is clear what the attraction of microfoundations purity was to those who wanted to banish Keynesian theory in the 1970s and 1980s. The argument of those who championed rational expectations and intertemporal consumption theory should have been: your existing [Keynesian] theory is full of holes, and you really need to do better – here are some ideas that might help, and let’s see how you get on. Instead, for many it was: your theory is irredeemable, and the problems you are trying to explain (and alleviate) are not really problems at all. In taking that kind of position it is quite helpful to follow a methodology where you get rather a lot of choice over which empirical facts you try to be consistent with.

So it is clear why the microfoundations debate is mixed up with the debate over Keynesian economics. It also seems clear to me that the microfoundations approach did reveal serious problems with the Keynesian analysis that had gone before, and that the New Keynesian analysis which emerged from the microfoundations project is a lot better for it. We now understand more about the dynamics of inflation and the business cycle, and monetary policy is better as a result. In this sense the microfoundations project is progressive.

But just because a methodology is progressive does not imply that it is the only proper way to proceed. When I wrote that focusing on microfoundations can distort the way macroeconomists think, I was talking about myself as much as anyone else. I feel I spend too much time thinking about microfoundations tricks, and give insufficient attention to empirical evidence that should have much more influence on modelling choices. I don’t think I can just blame anti-Keynesians for this: I would argue New Keynesians also need to be more pragmatic about what they do, and more tolerant of other ways of building macromodels.