Saturday, August 28, 2010

Lessons from the Kocherlakota Controversy

In a speech last week the President of the Minneapolis Fed, Narayana Kocherlakota, made the following rather startling claim:
Long-run monetary neutrality is an uncontroversial, simple, but nonetheless profound proposition. In particular, it implies that if the FOMC maintains the fed funds rate at its current level of 0-25 basis points for too long, both anticipated and actual inflation have to become negative. Why? It’s simple arithmetic. Let’s say that the real rate of return on safe investments is 1 percent and we need to add an amount of anticipated inflation that will result in a fed funds rate of 0.25 percent. The only way to get that is to add a negative number—in this case, –0.75 percent.

To sum up, over the long run, a low fed funds rate must lead to consistent—but low—levels of deflation.
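The arithmetic in the quoted passage is just the steady-state Fisher relation, under which the nominal rate equals the real rate plus expected inflation. A minimal sketch, using Kocherlakota's own illustrative numbers:

```python
# Kocherlakota's "simple arithmetic" is the steady-state Fisher equation:
#   nominal rate = real rate + expected inflation
real_rate = 0.01       # his assumed long-run real return on safe assets (1%)
nominal_rate = 0.0025  # fed funds rate pegged at 25 basis points

expected_inflation = nominal_rate - real_rate
print(f"{expected_inflation:.4f}")  # prints -0.0075, i.e. deflation of 0.75%
```

The identity itself is uncontroversial; the controversy, as the responses below make clear, is over whether an economy pegged at a low nominal rate would ever settle into this steady state.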
The proposition that a commitment by the Fed to maintain a low nominal interest rate indefinitely must lead to deflation (rather than accelerating inflation) defies common sense, economic intuition, and the monetarist models of an earlier generation. This was pointed out forcefully and in short order by Andy Harless, Nick Rowe, Robert Waldmann, Scott Sumner, Mark Thoma, Ryan Avent, Brad DeLong, Karl Smith, Paul Krugman and many other notables.

But Kocherlakota was not without his defenders. Stephen Williamson and Jesus Fernandez-Villaverde both argued that his claim was innocuous and completely consistent with modern monetary economics. And indeed it is, in the following sense: the modern theory is based on equilibrium analysis, and the only equilibrium consistent with a persistently low nominal interest rate is one in which there is a stable and low level of deflation. If one accepts the equilibrium methodology as being descriptively valid in this context, one is led quite naturally to Kocherlakota's corner.

But while Williamson and Fernandez-Villaverde interpret the consistency of Kocherlakota's claim with the modern theory as a vindication of the claim, others might be tempted to view it as an indictment of the theory. Specifically, one could argue that equilibrium analysis unsupported by a serious exploration of disequilibrium dynamics could lead to some very peculiar and misleading conclusions. I have made this point in a couple of earlier posts, but the argument is by no means original. In fact, as David Andolfatto helpfully pointed out in a comment on Williamson's blog, the same point was made very elegantly and persuasively in a 1992 paper by Peter Howitt.

Howitt's paper is concerned with the inflationary consequences of a pegged nominal interest rate, which is precisely the subject of Kocherlakota's thought experiment. He begins with an old-fashioned monetarist model in which output depends positively on expected inflation (via the expected real rate of interest), realized inflation depends on deviations of output from some "natural" level, and expectations adjust adaptively. In this setting it is immediately clear that there is a "rational expectations equilibrium with a constant, finite rate of inflation that depends positively on the nominal rate of interest" chosen by the central bank. This is the equilibrium relationship that Kocherlakota has in mind: lower interest rates correspond to lower inflation rates and a sufficiently low value for the former is associated with steady deflation. 

The problem arises when one examines the stability of this equilibrium. Any attempt by the bank to shift to a lower nominal interest rate leads not to a new equilibrium with lower inflation, but to accelerating inflation instead. The remainder of Howitt's paper is dedicated to showing that this instability, which is easily seen in the simple old-fashioned model with adaptive expectations, is in fact a robust insight and holds even if one moves to a "microfounded" model with intertemporal optimization and flexible prices, and even if one allows for a broad range of learning dynamics. The only circumstance in which a lower nominal rate results in lower inflation is if individuals are assumed to be "capable of forming rational expectations ab ovo".
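The instability is easy to exhibit in a few lines of simulation. The sketch below uses a stylized adaptive-expectations model in the spirit of the one Howitt starts from; the functional forms and all parameter values are illustrative assumptions, not his calibration.

```python
# A minimal sketch of Howitt's instability result under a pegged nominal
# rate, in an old-fashioned adaptive-expectations model. Parameters are
# illustrative assumptions.
i = 0.0025      # pegged nominal rate (25 basis points)
r_star = 0.01   # natural real rate of interest
beta, lam, gamma = 1.0, 0.5, 0.5  # IS slope, Phillips slope, adaptation speed

# The unique steady state has expected inflation = i - r_star = -0.75%,
# which is exactly Kocherlakota's arithmetic. Start slightly above it:
pi_e = (i - r_star) + 0.001

path = []
for t in range(20):
    real_rate = i - pi_e                       # Fisher relation
    output_gap = -beta * (real_rate - r_star)  # demand rises as real rate falls
    pi = pi_e + lam * output_gap               # expectations-augmented Phillips curve
    pi_e = pi_e + gamma * (pi - pi_e)          # adaptive expectations
    path.append(pi)

# Instead of converging to -0.75% deflation, inflation accelerates: each
# step multiplies the deviation from steady state by (1 + gamma*lam*beta) > 1.
print(path[0], path[-1])
```

Running this, inflation rises monotonically away from the deflationary steady state and turns positive within twenty periods: the Wicksell-Friedman cumulative process. The equilibrium Kocherlakota describes exists, but nothing in the dynamics ever takes the economy to it.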

Howitt places this finding in historical context as follows (emphasis added):
In his 1968 presidential address to the American Economic Association, Milton Friedman argued, among other things, that controlling interest rates tightly was not a feasible monetary policy. His argument was a variation on Knut Wicksell's cumulative process. Start in full employment with no actual or expected inflation. Let the monetary authority peg the nominal interest rate below the natural rate. This will require monetary expansion, which will eventually cause inflation. When expected inflation rises in response to actual inflation, the Fisher effect will put upward pressure on the interest rate. More monetary expansion will be required to maintain the peg. This will make inflation accelerate until the policy is abandoned. Likewise, if the interest rate is pegged above the natural rate, deflation will accelerate until the policy is abandoned. Since no one knows the natural rate, the policy is doomed one way or another.

This argument, which was once quite uncontroversial, at least among monetarists, has lost its currency. One reason is that the argument invokes adaptive expectations, and there appears to be no way of reformulating it under rational expectations... in conventional rational expectations models, monetary policy can peg the nominal rate... without producing runaway inflation or deflation... Furthermore... pegging the nominal rate at a lower value will produce a lower average rate of inflation, not the ever-higher inflation predicted by Friedman...

Thus the rational expectations revolution has almost driven the cumulative process from the literature. Modern textbooks treat it as a relic of pre-rational expectations thought... contrary to these rational expectations arguments, the cumulative process is not only possible but inevitable, not just in a conventional Keynesian macro model but also in a flexible-price, micro-based, finance constraint model, whenever the interest rate is pegged... the essence of the cumulative process lies not in an economy's rational expectations equilibria but in the disequilibrium adjustment process by which people try to acquire rational expectations... under a wide set of assumptions, the process cannot converge if the monetary authority keeps interest rates pegged... the cumulative process is a manifestation of this nonconvergence. 
Thus the cumulative process should be regarded not as a relic but as an implication of real-time belief formation of the sort studied in the literature on convergence (or nonconvergence) to rational expectations equilibrium... Perhaps the most important lesson of the analysis is that the assumption of rational expectations can be misleading, even when used to analyze the consequences of a fixed monetary regime. If the regime is not conducive to expectational stability, then the consequences can be quite different from those predicted under rational expectations... in general, any rational expectations analysis of monetary policy should be supplemented with a stability analysis... to determine whether or not the rational expectations equilibrium could ever be observed. 
To this I would add only that a stability analysis is a necessary supplement to equilibrium reasoning not just in the case of monetary policy debates, but in all areas of economics. For as Richard Goodwin said a long time ago, an "equilibrium state that is unstable is of purely theoretical interest, since it is the one place the system will never remain."

---

Update (8/29). From a comment by Robert Waldmann:
I think that it is important that in monetary models there are typically two equilibria -- a monetary equilibrium and a non-monetary equilibrium.

The assumption that the economy will end up in a rational expectations equilibrium does not imply that a low nominal interest rate leads to an equilibrium with deflation. It might lead to an equilibrium in which dollars are worthless.

I'd say the experiment has been performed. From 1918 through (most of) 1923 the Reichsbank kept the discount rate low (3.5% IIRC) and met demand for money at that rate.

The result was not deflation. By October 1923 the Reichsmark was no longer used as a medium of exchange.
In fact, the only stable steady state under a nominal interest rate peg in the Howitt model is the non-monetary one.

Thursday, August 19, 2010

On Broken Trades and Bailouts

Back in 1980, Avraham Beja and Barry Goldman published a theoretical paper in the Journal of Finance that explored the manner in which the composition of trading strategies in an asset market affects the volatility of prices. Their main insight was that if the prevalence of momentum-based strategies was too large relative to that of strategies based on fundamental analysis, then the dynamics of asset prices would be locally unstable: departures of prices from fundamentals would be amplified rather than corrected over time. More importantly, they argued that the relationship between the composition of strategies and market stability was discontinuous: there was a threshold (bifurcation) value of this population mixture that separated the stable from the unstable regime, and an imperceptible change in composition that took the market across the threshold could result in dramatic increases in volatility.

The Beja/Goldman analysis can be taken a step further: not only does market stability depend on the composition of trading strategies, but the profitability of different trading strategies, and hence changes in their relative population shares over time, depend very much on whether one is in a stable or an unstable regime. In a stable regime prices track fundamentals reasonably well, which makes it possible for technical strategies to extract information from incoming market data without going through the trouble and expense of fundamental research. Such strategies can therefore prosper and proliferate, provided that they remain sufficiently rare. But if they become too common, markets are destabilized, asset price bubbles can form, and the value of fundamental information rises. When a major correction arrives, it is the fundamental strategies that prosper, the composition of trading strategies is shifted accordingly, and market stability is restored for a time. This process of endogenous regime switching provides one possible interpretation of the empirical phenomenon known as volatility clustering.
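The bifurcation at the heart of the Beja/Goldman argument can be illustrated with a toy price-adjustment rule. The sketch below is a stylized rendering, not their model: the demand functions and parameters are illustrative assumptions chosen only to show how the same small mispricing dies out or explodes depending on the population mix.

```python
# A stylized sketch of the Beja-Goldman insight: price dynamics are stable
# when fundamentalists dominate and unstable when momentum traders do.
# Functional forms and parameters are illustrative assumptions.
def simulate(chartist_share, periods=50, v=100.0):
    """Price moves with excess demand from fundamentalists (who buy when the
    price is below fundamental value v) and chartists (who chase the most
    recent price change). Returns the final deviation from fundamentals."""
    f_share = 1.0 - chartist_share
    p_prev, p = v, v + 1.0  # start with a small mispricing
    for _ in range(periods):
        excess = f_share * (v - p) + chartist_share * 2.0 * (p - p_prev)
        p_prev, p = p, p + excess
    return abs(p - v)

print(simulate(0.2))  # fundamentalists dominate: the mispricing dies out
print(simulate(0.8))  # chartists dominate: the mispricing explodes
```

With a 20% chartist share the deviation decays to essentially zero; with 80% it grows by orders of magnitude. Somewhere between the two lies the threshold composition at which an imperceptible shift flips the market from the stable to the unstable regime.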

From this perspective, it is critically important that technical trading strategies be allowed to suffer losses when market instability arises. The cancellation of trades in almost 300 securities after the flash crash of May 6 did exactly the opposite, by providing an implicit subsidy to destabilizing strategies. The excuse that this was done to protect retail investors whose stop orders were executed as prices fell to insane levels is unconvincing. According to the SEC's own report on the crash, most trades against stub quotes of five cents or less were short sales, and there was also considerable upward instability, with prices rising well beyond the reach of ordinary retail investors. (Shares in Sotheby's, for instance, changed hands at ten million dollars per round lot.) The cancellation of trades was therefore a bailout of some funds (heavily reliant on algorithmic trading) at the expense of others, and this prevented a stabilizing shift in the market composition of trading strategies.

A similar argument could be made about the effects of the Troubled Asset Relief Program. It has recently been claimed, for instance by Alan Blinder and Mark Zandi, that TARP has been a "substantial success" because it averted a second Great Depression at a cost to taxpayers that is turning out to be much lower than originally feared:
The Troubled Asset Relief Program was controversial from its inception. Both the program’s $700 billion headline price tag and its goal of “bailing out” financial institutions—including some of the same institutions that triggered the panic in the first place—were hard for citizens and legislators to swallow. To this day, many believe the TARP was a costly failure. In fact, TARP has been a substantial success, helping to restore stability to the financial system and to end the freefall in  housing and auto markets. Its ultimate cost to taxpayers will be a small fraction of the headline $700 billion figure: A number below $100 billion seems more likely to us, with the bank bailout component probably turning a profit.
Yves Smith is unpersuaded by such figures, which she attributes to "back door, less visible bailouts, super cheap interest rates, [and] regulatory forbearance." But even if one were to take at face value the Blinder-Zandi estimates of the revenue consequences of TARP, there remain potentially harmful effects on the size composition of firms and the distribution of financial practices. The institutions that were bailed out made directional bets that either failed directly, or were with counterparties that would have failed in the absence of government support. Smaller institutions making such mistakes were allowed to go under, while larger ones were bailed out. Quite apart from the unfairness of this, the policy could be severely damaging to the stability of the system over the medium run.

This point was made a couple of months ago in a speech by Richard Fisher of the Dallas Fed (and expanded upon by Tyler Durden and Ashwin Parameswaran shortly thereafter):
Big banks that took on high risks and generated unsustainable losses received a public benefit... As a result, more conservative banks were denied the market share that would have been theirs if mismanaged big banks had been allowed to go out of business. In essence, conservative banks faced publicly backed competition...
The system has become slanted not only toward bigness but also high risk... Clearly, if the central bank and regulators view any losses to big bank creditors as systemically disruptive, big bank debt will effectively reign on high in the capital structure. Big banks would love leverage even more, making regulatory attempts to mandate lower leverage in boom times all the more difficult. In this manner, high risk taking by big banks has been rewarded, and conservatism at smaller institutions has been penalized...

It is not difficult to see where this dynamic leads—to more pronounced financial cycles and repeated crises.
Fisher goes on to argue for strict limits on the size of individual financial institutions relative to that of the industry. So does Nouriel Roubini:
Greed has to be controlled by fear of loss, which derives from knowledge that the reckless institutions and agents will not be bailed out. The systematic bailouts of the latest crisis – however necessary to avoid a global meltdown – worsened this moral-hazard problem. Not only were “too big to fail” financial institutions bailed out, but the distortion has become worse as these institutions have become – via financial-sector consolidation – even bigger. If an institution is too big to fail, it is too big and should be broken up.
But were the bailouts really necessary to avoid a global meltdown? Blinder and Zandi argue that the alternative would have been completely catastrophic:
The financial policy responses were especially important. In the scenario without them, but including the fiscal stimulus, the recession would only now be winding down, a full year after the downturn’s actual end... The differences between the baseline and the scenario based on no financial policy responses... represent our estimates of the combined effects of the various policy efforts to stabilize the financial system — and they are very large. By 2011, real GDP is almost $800 billion (6%) higher because of the policies, and the unemployment rate is almost 3 percentage points lower. By the second quarter of 2011 — when the difference between the baseline and this scenario is at its largest — the financial-rescue policies are credited with saving almost 5 million jobs.
Here the baseline is the set of policies actually pursued (including fiscal and financial policies) and it is being compared to the case of "no financial policy responses." However, as Yves Smith and Barry Ritholtz have pointed out, this is an absurd counterfactual. Barry argues that  the proper point of comparison ought to be what should have been done, which in his view is the following:
One by one, we should have put each insolvent bank into receivership, cleaned up the balance [sheet], sold off the bad debts for 15-50 cents on the dollar, fired the management, wiped out the shareholders, and spun out the proceeds, with the bondholders taking the haircut, and the taxpayers on the hook for precisely zero dollars. Citi, Bank of America, Wamu, Wachovia, Countrywide, Lehman, Merrill, Morgan, etc. all of them should have been handled this way.

The net result of this would have been more turmoil, lower stock prices, and a sharper, but much shorter economic contraction. It would have been painful and disruptive — like emergency surgery is — but its better than an exploded appendix.

And today, we would have a much healthier economy.
Whether or not one agrees with this assessment, Yves and Barry are surely correct in arguing that counterfactuals other than the hands-off policy ought to be considered before one accepts the emerging conventional wisdom that the authorities handled the crisis well.

What the broken trades of May 6 and the bailouts of 2008 have in common is that they were both impulsive decisions, designed to deal with immediate concerns, and executed with little regard for their long term consequences. As I said in an earlier post, these decisions were made under enormous pressure with little time for reflection, and mistakes made in such circumstances would ordinarily be forgivable. But to insist that the best available course of action was taken, and that any alternative would have had devastating economic costs, is neither credible nor wise.

---

Update (8/20). The comments on this post by Andy Harless, David Merkel and Economics of Contempt are worth reading. Andy thinks that I am attacking a straw man and that the Ritholtz proposal was not even feasible, let alone optimal. David questions the use by Blinder and Zandi of a forecasting model to generate counterfactuals, given the appalling performance of such models in predicting the crisis in the first place. And here's Economics of Contempt:
"Smaller institutions making such mistakes were allowed to go under, while larger ones were bailed out."

I have to take issue with that statement. Yes, large banks were bailed out, but hundreds upon hundreds of small banks were bailed out too! Fully 836 financial institutions were bailed out using TARP money, the vast majority of which were small banks. While it's true that most of the bank failures have been small banks, there were large banks that were allowed to fail too -- e.g., Lehman, WaMu.

As for Barry Ritholtz's alternative scenario, there are too many basic factual errors to take it seriously. For one thing, receivership wasn't available to non-commercial banks. It was also legally impossible to separate AIGFP from AIG, since AIG had unconditionally guaranteed all of AIGFP's liabilities, and all their trades included cross-default provisions. A lot of the actions Barry proposes were literally impossible to do. It's simply not a credible list, and I'm surprised that you would fall for it.

Finally, I think it's unfair to say that the bailouts created bad precedents without also mentioning that we now have a resolution authority for non-bank financial institutions. How are decisions that were made without the availability of a resolution authority proper precedents for decisions that will be made with a resolution authority? You would never say that decisions made in pre-FDIC bank failures are proper precedents for post-FDIC bank failures, would you?
These are all good points. I probably should have been a bit more skeptical when discussing the Ritholtz scenario. I did not intend to endorse his proposal, only to suggest that we need to think through a broad range of counterfactuals in evaluating the response to the crisis. But of course these counterfactuals must be feasible given the tools available at the time, and his point about the resolution authority is well taken.

What bothered me most about Geithner's congressional testimony was his claim that "the government’s strategy regarding AIG was essential to our success in confronting the worst financial crisis in generations." That is, in averting an economic calamity, there was no alternative to the government making massive payouts on privately negotiated speculative bets. This is a bold claim with very serious consequences and ought not to be made lightly. In particular, the consequences of alternative scenarios have to be traced out with some seriousness.

Wednesday, August 18, 2010

On Teachable Moments and Non-Conversations

The latest in the series of consistently interesting dialogues between Glenn Loury and John McWhorter has been posted:


Among the themes explored in this conversation is the manner in which a steady stream of "race-related events" is turned, amid a media frenzy, into "teachable moments" and calls for a "national conversation on race." As examples they cite the Gates arrest a year ago, the recent vilification and vindication of Shirley Sherrod, Harry Reid's characterization of Obama's dialect, Hillary Clinton's comments on the King legacy, and Jim Clyburn's response to these comments -- all part of a long list of racially charged incidents that briefly occupy the national spotlight from time to time.
Glenn, for one, is terribly weary of the "melodramatic dance that we do about race and racial etiquette in this society" in the wake of such events:
I’m tired of the national non-conversation on race... we’re not talking about real things… we’re mired in a kind of superficial morality of expressive convention… what can and cannot properly be said by right thinking people… fingerpointing… grandstanding… moralizing… real genuine moral engagement with serious problems in our society… gets short shrift while everyone is posturing… checking the scorecard to see what the exactly correct way of expressing something is… I’m just so weary of this.
John agrees that the "posturing" and "witch-hunting" is little more than "a theatrical production that we are taught to pretend is an engagement with something substantial."
In contrast to the loud (if brief) responses to these so-called teachable moments, there is almost total silence in the public sphere about the really serious issues with which we need to be grappling. Here's Glenn:
One million African American men under lock and key on any given day… structured, reproduced inequality of a raced nature… violent crime perpetrated by black people often on other black people at enormous scale… children with no prospect to realize their God-given talents or their human potential because the institutions designed to facilitate their development have failed them totally… these are the things that demonstrate that society is not in a post-racial moment, and they turn out to be a lot less about theatrics and a lot more about politics, policy, candor… if we wanted to have a conversation on race we’d have to start with some of the really hard stuff, and I’m afraid it wouldn’t be as easy as hunting out politically incorrect racists and then calling them what they are.
Ta-Nehisi Coates has also recently addressed the issue of national non-conversations, arguing that we learn nothing because we aspire not to know:
I keep hearing people bantering about this notion of a national conversation on race, and I have finally figured out why it rankles so... Expecting an American conversation on race in this country, is like expecting financial advice from someone who prefers to not check their bank balance. It's not that the answers, themselves, are pre-ordained, its that we are more interested in  answers than questions, in verdicts than evidence...

Put bluntly, this is a country too ignorant of itself to grapple with race in any serious way. The very nomenclature -- "conversation on race" -- betrays the unseriousness of the thing by communicating the sense that race can be boxed from the broader American narrative, that you can somehow talk about Thomas Jefferson without Sally Hemmings; that you can discuss Andrew Jackson without discussing his betrayal of the black artillerymen who fought at the Battle of New Orleans; that you can discuss the suffrage without Sojourner Truth, Ida B. Wells or Frederick Douglass; that you can discuss temperance without understanding the support of the Klan; that you can discuss the path to statehood in Florida without discussing Fort Gadsen; that you can talk Texas without understanding cotton, and so on.

It's not so much that we don't know -- it's that we aspire to not know. The ignorance of the African-American thread in the broader American quilt -- the essential nature of that thread -- is willful... Race isn't a "distraction" from Obama's agenda -- it's the compromised, unsure ground upon which this country walks everyday...

Talk is overrated. There can be no talk with people who've conditioned themselves out of listening. This is the country we've made. This is the country we deserve.
Coates returns to this theme in a follow-up post prompted by Jim Webb's column on diversity:
I think the fact that we don't really have the implements to carry out this much ballyhoed conversation were really brought home by Jim Webb's piece "The Myth Of White Privilege"... The title, itself, is a device meant to drive conservatives to cheering, liberals to howling, and the whole of them all to page-clicking and reading, In short, it proceeds not from any desire to conversate, as we say, but to provoke strong emotion, and hopefully, page-views... I hate unthinking equivalence, but its quite clear to me that liberals and conservatives both have prominent camps that enjoy yelling.

But its still worth teasing out the intentions and the argument. The questions, themselves, are serious and worthy ones: What is "white privilege" to those who are white and poor, seemingly in perpetuity? Does Affirmative Action exist to promote diversity or historical redress? Is it both? If so, why? Who should be on the receiving end of such redress? Do immigrants from the Caribbean and Africa count? How do Native Americans fit in? What does it mean to have Affirmative Action for white women, many of whom will in turn marry white men?

How do we, specifically, define Affirmative Action? Is it any effort at diversity by anyone, anywhere? Do the questions I listed change depending on the venue? When I was hired, surely the Atlantic relished the idea of adding an African-American to their masthead. Was that Affirmative Action? If so, was it different than what happens, say, at Harvard? Was it bad?

How much does Affirmative Action actually affect white workers? How much discrimination are they actually suffering? In what spheres is this discrimination most prevalent? Are poor whites actually losing out to "people of color?" Do we have any stats on how many people have been affected by Affirmative Action? How broad is its impact?

I'm not really interested in answering any of these questions here and now, so much as I'm interested in asserting their validity, and asserting that they will always be ill-served by an 800 word op-ed with an inflammatory title. My sense is that there are answers to all of these queries. But I don't think we much care to have them. Jim Webb's piece, most regrettably, followed in the tradition of Henry Louis Gates' column on reparations, in that it is a sign post, a line of demarcation. An exclamation point, as opposed to a question mark.

The "conversation around race" is, itself, a kind of tribalism, wherein you look for ways to justify -- instead of interrogate -- your most elemental feelings.
Loury and McWhorter also discuss the Webb column at some length in their dialogue, and also consider the questions posed there to be serious and worthy. In addition, they feel that the column is indicative of a major shift in the manner of public expression regarding race. Here's Glenn again:
I think that the whole regime of genuflection at the altar of correct racial expression is on the verge of collapse… I sense in the air around me here, in the kinds of things that people are saying… Obama’s ascendancy … has contributed to that… as Obama’s success has made it easier for people to breach the etiquette of racial expression, the conversation is going to get a little rougher… people are going to let go of stuff that they’ve been holding on to for a while.
John puts it as follows: "there’s a sea change coming... we can predict, when people start letting it out… a lot of it is going to be pent up… some of it is going to come out infelicitously phrased." 
The conversation that Coates has given up on is happening already, and he is at the heart of it (as are Loury and McWhorter). But it is being drowned out by the shrillest voices, and I fear that the yelling will be ramped up for a while longer before it eventually subsides.

---

Sara Mayeux's response to the Webb column is also worth reading; she argues that the inflammatory title may well have been chosen by an "overzealous copyeditor." My own thoughts on the Gates arrest may be found here, and some reflections on an extraordinary essay by Ralph Ellison (that demonstrates brilliantly the essential nature of the "African American thread in the broader American quilt") here.

Glenn and I co-taught a course on Group Inequality at the Universidad de Los Andes in July, and were interviewed by Dinero while in Bogotá. (The interview was conducted in English, and translated for publication.)

---

Update (8/18). Maxine Udall has a characteristically thoughtful follow-up.

Saturday, August 07, 2010

History Versus Expectations in Sub-Saharan Africa

Ngozi Okonjo-Iweala believes that sub-Saharan Africa "is on the verge of joining the ranks" of the so-called BRIC nations:
What trillion dollar economy has grown faster than Brazil and India between 2000 and 2010 in nominal dollar terms and is projected by the IMF to grow faster than Brazil between 2010 and 2015? The answer may surprise you: it is Sub-Saharan Africa... At a time when Asian equity and debt markets are saturated and no longer offer substantial returns, SSA could be poised to provide the best global risk-return profile.
She supports these claims with a wealth of data on recent trends in growth, inflation, exchange reserves, foreign direct investment flows, receipts from international tourism, spreading democratization, declining gender disparities, and improved security.
But Ngozi is also careful to note that this projected take-off is by no means a foregone conclusion. She argues that a concerted development effort is necessary, including a "big push" on investments in education and infrastructure. In order to finance this, she proposes the diversion by donor nations of a portion of anticipated future foreign aid in order to back a current bond issue, effectively securitizing these flows. While such an initiative would help close an infrastructure funding gap over the next few years, Ngozi maintains that its most important effect would be to "change perceptions overnight about Africa as a place to do business."
The idea that coordinated optimism about the future prospects of a region could be a critical determinant of its subsequent growth performance was advanced in a classic paper by Rosenstein-Rodan in 1943, building on earlier work by Allyn Young. The argument is based on the fact that the development of any particular industry may be privately profitable only if an entire set of interlocking industries emerges simultaneously. Hence the need for a "big push":
Complementarity of different industries provides the most important set of arguments in favour of a large-scale planned industrialisation... It might easily happen that any one enterprise would not be profitable enough to guarantee payment of sufficient interest or dividend out of its own profits. But the creation of such an enterprise... may create new investment opportunities and profits elsewhere... If we create a sufficiently large investment unit by including all the new industries of the region, external economies will become internal profits out of which dividends may be paid easily.
Or, as Paul Krugman put it in his influential paper on history versus expectations, "the doctrine of Rosenstein-Rodan" entails the following claim: "the willingness of firms to invest depends on their expectation that other firms will invest, so that the task of development policy is to create convergent expectations around high investment." 
Krugman's goal in that paper was to point out that there are many contexts in which multiple equilibria of the kind that concerned Rosenstein-Rodan arise, and to address the question of how one of these is eventually selected: 
Once one has multiple equilibria, however, there is an obvious question: which equilibrium actually gets established? Although few have emphasized this point, there is a broad division into two camps... On one side is the belief that the choice among multiple equilibria is essentially resolved by history: that past events set the preconditions that drive the economy to one or another steady state... On the other side, however, is the view that the key determinant of choice of equilibrium is expectations: that there is a decisive element of self-fulfilling prophecy...
The distinction between history and expectations as determinants of the eventual outcome is an important one. Both a world in which history matters and a world of self-fulfilling expectations are different from the standard competitive view of the world, but they are also significantly different from each other. Obviously, also, there must be cases in which both are relevant. Yet in the recent theoretical literature models have tended to be structured in such a way that either history or expectations matter, but not both... in the real world, we would expect there to be circumstances in which initial conditions determine the outcome, and others in which expectations may be decisive. But what are these circumstances? 
It's clearly an important question, and in order to address it Krugman builds a simple two-sector model with increasing returns in one sector and constant returns in the other. There are two long run equilibria, each of which involves complete specialization in one of the two goods. If the initial state of the economy involves incomplete specialization there will be movement of resources across sectors. But in which direction? 
If shifts in resources across sectors cannot be costlessly reversed, then such movements will depend not only on current inter-sectoral wage differences, but also on anticipated future differentials, which in turn depend on expectations about the future movements of resources across sectors. Krugman shows that there is a range of initial conditions, which he calls the overlap, from which either one of the two long run states can be approached if and only if it is expected to be realized. If initial conditions lie outside this range, history is decisive; otherwise expectations matter a great deal in determining the eventual pattern of specialization.
The overlap may be viewed as a zone of uncertainty within which coordinated optimism can have major economic effects. Outside this zone, for better or worse, we are shackled by our history. Within it, expectations become crucially important. 
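The overlap can be conveyed by a deliberately crude simulation. All functional forms and parameters below are my own illustrative inventions, not Krugman's model: workers relocate toward the increasing-returns sector only when the discounted payoff gap along the path they expect is positive. For intermediate initial conditions either expectation is self-fulfilling; at the extremes, history decides regardless of what is expected.

```python
def pv_gap(x0, target, beta=0.9, speed=0.1, horizon=200):
    """Discounted flow-payoff gap along the path agents expect: x (the
    share of labor in the increasing-returns sector) drifts toward
    `target`, and the flow gap favors that sector when x > 0.5.
    All functional forms here are illustrative, not Krugman's."""
    x, pv = x0, 0.0
    for t in range(horizon):
        pv += beta ** t * (x - 0.5)
        x += speed * (target - x)
    return pv

def realized_limit(x0, expected, speed=0.1, steps=400):
    """Workers move into the increasing-returns sector only if the
    expected discounted gap is positive; returns the long-run share."""
    x = x0
    for _ in range(steps):
        move = 1.0 if pv_gap(x, expected) > 0 else 0.0
        x += speed * (move - x)
    return round(x)

# Inside the overlap (e.g. x0 = 0.3 or 0.7) the outcome tracks whichever
# long-run state is expected; outside it (x0 = 0.02 or 0.97) initial
# conditions are decisive no matter what agents expect.
for x0 in (0.02, 0.3, 0.7, 0.97):
    print(x0, realized_limit(x0, expected=1.0), realized_limit(x0, expected=0.0))
```

The design choice worth noting is that expectations enter only through the present-value calculation: the same adjustment rule generates history-dependence at the extremes and self-fulfilling prophecy in the middle, which is exactly the division Krugman emphasizes.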
The question, then, is whether or not sub-Saharan Africa is now in or around this zone of uncertainty where expectations can be a critical determinant of its future development performance. Shanta Devarajan has recently pointed to a number of success stories that seem to suggest the stirrings of something major:
In recent years, a broad swath of African countries has begun to show a remarkable dynamism.  From Mozambique’s impressive growth rate (averaging 8% p.a. for more than a decade) to Kenya’s emergence as a major global supplier of cut flowers, from M-pesa’s mobile phone-based cash transfers to KickStart’s low-cost irrigation technology for small-holder farmers, and from Rwanda’s gorilla tourism to Lagos City’s Bus Rapid Transit system, Africa is seeing a dramatic transformation.  This favorable trend is spurred by, among other things, stronger leadership, better governance, an improving business climate, innovation, market-based solutions, a more involved citizenry, and an increasing reliance on home-grown solutions.  More and more, Africans are driving African development.
I quoted this passage in an earlier post as a counterpoint to William Easterly's rather startling claim that "78 percent of the difference in income today between sub-Saharan Africa and Western Europe is explained by technology differences that already existed in 1500 AD – even before the slave trade and colonialism." My quarrel is not with this statement as an empirical claim, but rather with its implication that the heavy hand of history will continue to weigh inexorably upon the region. Sometimes we are bound by the past and sometimes not, and it is important to recognize opportunities to break free when they arise. And such an opportunity may well be emerging right before our eyes in Africa.

---

I am grateful to E. Somanathan for reminding me of the paper by Rosenstein-Rodan, and to Joao Farinha for his thoughtful and informative responses to my earlier post on this topic.  

Wednesday, July 28, 2010

Equilibrium Analysis

In a recent post on his (consistently interesting) blog, David Murphy questions the value of equilibrium analysis in economics and finance, and points to two earlier posts of his in which the same point is made. Here he is in July 2007:
An interesting post on the Street Light Blog, on currency misalignments, suggests an interesting question: is economics an equilibrium discipline? The very idea of a misaligned FX rate suggests that the natural state is an aligned one: perhaps the fundamentals move faster than the markets adjust, so FX is never in equilibrium. Perhaps (in the language of statistical mechanics) the relaxation time is much longer than the average time between forcings. 
And here, in August 2008:
My own view is that finance is not an equilibrium discipline, mostly, so while classical economics might work well in explaining the price of coffee... it does rather less well in asset allocation or explaining the return distribution of financial assets. Rather new news arrives faster than the market can restore equilibrium after the last perturbation, meaning that most of the time equilibrium is not a useful concept.
In a 1975 paper that remains worth reading to this day, James Tobin was explicit about the limitations of equilibrium analysis in understanding large scale economic fluctuations:
Keynes's General Theory attempted to prove the existence of equilibrium with involuntary unemployment, and this pretension touched off a long theoretical controversy. A. C. Pigou, in particular, argued effectively that there could not be a long-run equilibrium with excess supply of labor. The predominant verdict of history is that, as a matter of pure theory, Keynes failed to prove his case.

Very likely Keynes chose the wrong battleground. Equilibrium analysis and comparative statics were the tools to which he naturally turned to express his ideas, but they were probably not the best tools for his purpose... The real issue is not the existence of a long-run static equilibrium with unemployment, but the possibility of protracted unemployment which the natural adjustments of a market economy remedy very slowly if at all. So what if, within the recherché rules of the contest, Keynes failed to establish an "underemployment equilibrium"? The phenomena he described are better regarded as disequilibrium dynamics.
Tobin then goes on to develop a dynamic disequilibrium model of the macroeconomy (discussed at length here) which has a unique equilibrium characterized by full employment, steady inflation, and correct expectations. He shows that even if this equilibrium is locally stable, so that small perturbations are self-correcting, it need not be globally stable: sufficiently large shocks to the economy can result in cumulative divergence away from equilibrium unless arrested by a significant policy response. This seems to describe what we have experienced over the past couple of years better than any equilibrium model of which I am aware.
Note that Tobin's model is deterministic. The problem here is not that the economy is being buffeted by frequent shocks that arrive before a transition to equilibrium can occur, it is that the internal dynamics of adjustment simply do not approach the equilibrium from certain (large) sets of initial states even in the absence of shocks. The idea that the instability of steady growth with respect to disequilibrium dynamics is an important feature of modern market economies, and cannot be neglected in a comprehensive theory of economic fluctuations, was forcefully advanced by Richard Goodwin as far back as 1951, and Paul Samuelson had explored the possibility even earlier. As Willem Buiter has recently lamented, this line of research in macroeconomics simply dried up about a generation ago.
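The distinction between local and global stability in a deterministic adjustment process can be seen in a minimal sketch. The cubic law of motion below is my own illustrative choice, not Tobin's system: the equilibrium at zero absorbs small displacements, while sufficiently large ones diverge cumulatively even though no shocks arrive along the way.

```python
def adjust(x0, dt=0.01, steps=2000):
    """Euler simulation of dx/dt = -x + x**3 (illustrative only).
    The equilibrium x = 0 is locally stable: small deviations decay.
    It is not globally stable: deviations beyond |x| = 1 feed on
    themselves and diverge cumulatively."""
    x = x0
    for _ in range(steps):
        x += dt * (-x + x ** 3)
        if abs(x) > 1e6:            # cumulative divergence from equilibrium
            return float("inf")
    return x

print(adjust(0.5))   # a small perturbation is self-correcting
print(adjust(1.5))   # a large one is not, absent a policy response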
Another area in which equilibrium analysis is likely to be inadequate is in the study of asset markets with significant speculative activity. Price and volume dynamics in such markets depend not just on changes in fundamentals but also on the distribution of trading strategies, and this in turn adjusts under pressure of differential performance. The idea of an equilibrium composition of trading strategies is a contradiction in terms: if there were any such thing there would be a new strategy that could enter to exploit the resulting regularity. It is the complexity of this disequilibrium process that allows information arbitrage efficiency to be approximately satisfied, while allowing for significant departures from fundamental valuation efficiency (the distinction, naturally, is also due to Tobin).
Finally consider Hyman Minsky's financial instability hypothesis, built on the paradoxical idea that stability itself can be destabilizing. In Minsky's framework stable expansions give rise to increasingly aggressive financial practices as those firms having the greatest maturity mismatch between assets and liabilities profit relative to their closest competitors. The resulting erosion in margins of safety increases financial fragility, interpreted as the likelihood that a major default will trigger a crisis of liquidity. Such a crisis eventually materializes, devastating precisely those firms whose actions gave rise to greater fragility. The balance of financial practices is then shifted in favor of increased prudence, and the stage is set for another period of stability. Trying to give this analysis an equilibrium interpretation is a futile exercise; expectations of financial market tranquility are self-falsifying, and no fixed distribution of financial practices can be stable. 
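A caricature of this mechanism, with every parameter invented purely for illustration, produces a persistent cycle rather than a rest point: fragility compounds during tranquil stretches, a crisis resets financial practices toward prudence, and the sequence repeats.

```python
def minsky_cycle(periods=400, threshold=1.0, growth=0.03, reset=0.2):
    """Toy caricature of the financial instability hypothesis (all
    parameters are illustrative). Fragility compounds during tranquil
    expansion; crossing the threshold triggers a crisis that restores
    conservative practices. Returns the dates of successive crises."""
    fragility, crises = reset, []
    for t in range(periods):
        fragility *= 1 + growth        # stability breeds aggressive finance
        if fragility >= threshold:     # a liquidity crisis materializes
            crises.append(t)
            fragility = reset          # margins of safety are rebuilt
    return crises

print(minsky_cycle())  # crises recur at regular intervals: a cycle, not a fixed point
```

No distribution of financial practices is stationary here; the only persistent object is the cycle itself, which is the sense in which an equilibrium interpretation misses the point.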
Given the potential of disequilibrium dynamic models to illuminate our understanding of the economy, why are they generally neglected in contemporary economics? In part it is because the quality of a disequilibrium model is hard to evaluate and the dynamics are necessarily arbitrary to a degree. There is a professional consensus on how equilibrium analysis should be done, but none (so far) when it comes to disequilibrium analysis. Furthermore, equilibrium models can be enormously insightful, even in applications to macroeconomics and finance. The work of John Geanakoplos on the leverage cycle is a case in point, and Abreu and Brunnermeier's paper on bubbles and crashes is another. I have used equilibrium methods frequently and will continue to do so. But it seems that there ought to be greater space in the profession for serious work on the dynamics of disequilibrium.

---

Update (7/31). In an email (posted with permission) David Murphy adds:
One of the main reasons people study equilibrium models is that they are an order of magnitude easier, mathematically, than non-equilibrium. If you consider a simple problem like cooling, for instance, the equilibrium version is high school physics, and the non-equilibrium version is still a research problem (with useful application to the improvement of annealing methods). Economists in my experience are very comfortable with the maths they know (a bit - stress a bit - of stochastic calculus), but they are not willing to venture much further because it gets really, really hard quite quickly.  Hence the 'we have a hammer, everything looks like a nail' problem.
I think this is correct as far as analytical results are concerned; proving theorems (aside from convergence-to-equilibrium results) with the standard toolbox is not easy in disequilibrium models.  But if one adopts a computational approach the reverse may be true. I, for one, find it easier to write an algorithm to simulate a recursive system than one that requires the computation of fixed points in high dimensional spaces. But (as noted above) there does not exist anything close to a professional consensus on how the quality of such models should be evaluated, and so they are usually rejected out of hand at most mainstream journals.
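A trivial example of the computational point, using an invented excess-demand function: simulating a disequilibrium adjustment rule is just iterating it forward, and no fixed-point computation appears anywhere. Whether the path settles down is something the simulation reveals rather than assumes.

```python
def excess_demand(p):
    """Toy one-good market: demand 10/p, supply p (illustrative only)."""
    return 10.0 / p - p

def tatonnement(p0=0.5, k=0.2, steps=200):
    """Recursive disequilibrium price adjustment: iterate the rule and
    record the whole path, which is the object of interest whether or
    not it happens to converge to the market-clearing price."""
    path = [p0]
    for _ in range(steps):
        path.append(path[-1] + k * excess_demand(path[-1]))
    return path

path = tatonnement()
print(path[-1])  # here the path does converge, to the clearing price sqrt(10)
```

With less well-behaved excess-demand functions or adjustment speeds the same three lines of iteration can produce cycles or divergence, which is precisely the kind of behavior that fixed-point methods are structurally unable to exhibit.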

Sunday, July 25, 2010

East Asian Tigers and African Lions

William Easterly has recently argued that contemporary poverty in African nations may largely be accounted for by technological differences that date back for centuries, if not millennia:
1500 AD technology is a particularly powerful predictor of per capita income today. 78 percent of the difference in income today between sub-Saharan Africa and Western Europe is explained by technology differences that already existed in 1500 AD – even before the slave trade and colonialism. Moreover, these technological differences had already appeared by 1000 BC. The state of technology in 1000 BC has a strong correlation with technology 2500 years later, in 1500 AD...
A large role for history is still likely to sit uncomfortably with modern development practitioners, because you can’t change your history. But we have to face the world as it is, not as we would like it to be...
In a recent speech in Kampala, Gordon Brown offered a prognosis (coupled with a long list of policy recommendations) that was decidedly less gloomy about the future of Africa. Building on the observation that the continent is "full of more untapped potential and unrealised talent than any other," Brown continued as follows:
Twenty years ago nobody would have predicted that China and India would be the big drivers of growth and political superpowers they have become. And there is no reason to believe the countries of Africa cannot make similar leaps in the decades to come.... just as people have spoken of an American century and an Asian century, I believe we can now speak of an African century...
I believe the new African growth will come from five sources:
  • a faster pace of economic integration in Africa's internal market, and between your market and those of other continents, facilitated by investment in infrastructure
  • a broader based export-led growth, founded on new products and services
  • investment in the private sector from African and foreign sources in firms that create jobs and wealth
  • the up-skilling of the workforce, including through the acceleration of education provision, IT infrastructure and uptake and finally through
  • more effective governance to ensure that effective states can discharge their task of creating growth and reducing poverty
Each of these five priorities will be difficult to achieve. But we should remember the value of the prize. Because if we can agree a new model of post-crisis growth then Africa - already a 1.6 trillion economy - will continue to grow even faster than the rest of the world. This is not my assessment, but that of the world's leading companies and analysts. For example a report just published by the McKinsey Global Institute claims that Africa's consumer spending could reach 1.4 trillion dollars by 2020 - a 60% increase on 2008. In other words in ten years African consumer spending will be as big as the whole African economy is today.

It is those sorts of projections which mean people are now rightly talking not just of East Asian tigers, but of African lions.
Brown is careful to note that this rosy scenario is a "possibility rather than a probability" and that "it will happen through choice not chance." But, as Shanta Devarajan has recently observed, the choices necessary to make it happen are already being made:
In recent years, a broad swath of African countries has begun to show a remarkable dynamism.  From Mozambique’s impressive growth rate (averaging 8% p.a. for more than a decade) to Kenya’s emergence as a major global supplier of cut flowers, from M-pesa’s mobile phone-based cash transfers to KickStart’s low-cost irrigation technology for small-holder farmers, and from Rwanda’s gorilla tourism to Lagos City’s Bus Rapid Transit system, Africa is seeing a dramatic transformation.  This favorable trend is spurred by, among other things, stronger leadership, better governance, an improving business climate, innovation, market-based solutions, a more involved citizenry, and an increasing reliance on home-grown solutions.  More and more, Africans are driving African development.
Shanta links to a long list of emerging African success stories.  
While the economic consequences of an African resurgence will be major, the social implications could be even more profound. I believe that the rise of the African lions will do more to shatter racial stereotypes in the United States and elsewhere than any government policy or electoral outcome. But that is a topic for another post.

---

Update (7/25). I do not dispute the empirical claims made by Comin, Easterly and Gong, nor do I mean to suggest that Brown's speech and Devarajan's post have any bearing on these claims. But I have serious doubts about the relevance of their findings for identifying future centers of economic dynamism or for shaping development policy. History can matter for long periods of time (for instance in occupational inheritance or the patrilineal descent of surnames) and then cease to constrain our choices in any significant way. Once-reliable correlations can break down suddenly and completely; history is full of such twists and turns. As far as African prosperity is concerned, I believe that a discontinuity of this kind is inevitable if not imminent.

From an overview of the McKinsey report referenced by Brown:
While Africa's increased economic momentum is widely recognized, less known are its sources and likely staying power... Africa's growth acceleration was widespread, with 27 of its 30 largest economies expanding more rapidly after 2000. All sectors contributed, including resources, finance, retail, agriculture, transportation and telecommunications. Natural resources directly accounted for just 24 percent of the continent's GDP growth from 2000 through 2008. Key to Africa's growth surge were improved political and macroeconomic stability and microeconomic reforms... total foreign capital flows into Africa rose from $15 billion in 2000 to a peak of $87 billion in 2007... Today the rate of return on foreign investment in Africa is higher than in any other developing region.

Sunday, July 18, 2010

David Blackwell, 1919-2010

The renowned mathematician David Blackwell died on July 8 at the age of 91.
I first came across Blackwell's name in a widely-cited paper by Kalai and Lehrer on learning in repeated games. Kalai and Lehrer identified conditions under which players with different initial subjective beliefs about each other's strategies will nevertheless converge to behavior that approximates a Nash equilibrium of the repeated game. In establishing this, the authors relied heavily on the Blackwell-Dubins Theorem:
Our proof of the convergence to playing an ε-Nash equilibrium is divided into three steps. The first establishes a general self-correcting property of Bayesian updating. This is a modified version of the seminal Blackwell and Dubins' (1962) result about merging of opinions... When applied to our model, the self-correcting property shows that the probability distributions describing the players' beliefs about the future play of the game must converge to the true distribution. In other words, the beliefs and the real play become realization equivalent.
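The flavor of the Blackwell-Dubins result can be conveyed by a toy Bayesian example (the coin, the priors, and the true bias are all invented for illustration): two agents with sharply different full-support priors over a coin's bias observe the same flips, and their predictive probabilities are forced together by the shared data.

```python
import random

random.seed(0)

def predictive(a, b):
    """Predictive probability of heads under a Beta(a, b) belief."""
    return a / (a + b)

a1, b1 = 1.0, 9.0    # agent 1's prior: heads are rare
a2, b2 = 9.0, 1.0    # agent 2's prior: heads are common
for _ in range(2000):
    heads = random.random() < 0.7           # true bias, unknown to both
    a1, b1 = a1 + heads, b1 + (not heads)   # standard conjugate updating
    a2, b2 = a2 + heads, b2 + (not heads)

# After a long shared history the predictions nearly coincide,
# however far apart the priors were: the opinions have merged.
print(abs(predictive(a1, b1) - predictive(a2, b2)))
```

The theorem itself is far stronger: it concerns merging of beliefs over entire infinite futures, not just next-period predictions, which is what makes it powerful enough to drive the Kalai-Lehrer convergence result.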
While Blackwell's work is familiar in economics largely through this result, he is also known for the Rao-Blackwell Theorem and his book (with Meyer Girshick) on the Theory of Games and Statistical Decisions. [Update: A much fuller discussion of his influence and contributions may be found here.]
Blackwell earned his doctorate in mathematics at the age of 22 from the University of Illinois, where his thesis adviser was Joseph Doob. He then spent a year at the Institute for Advanced Study in Princeton, where his cohort included Shizuo Kakutani, Paul Halmos, Leonard Savage, and Alfred Tarski. He was elected to the National Academy of Sciences in 1965 and was the sole recipient of the John von Neumann Theory Prize in 1979 (sandwiched between Nash and Lemke in 1978 and Gale, Kuhn and Tucker in 1980).
While his accomplishments are stellar and many, it is also worth contemplating the many slights that Blackwell had to endure over the course of his career:
Blackwell was appointed a Postdoctoral Fellow at the Institute for Advanced Study from 1941 for a year. At that time, members of the Institute were automatically officially made visiting fellows of Princeton University, and thus Blackwell was listed in its bulletin as such. This caused considerable ruckus as there had never been a black student, much less faculty fellow, at the University... The president of Princeton wrote the director of the Institute that the Institute was abusing the University's hospitality by admitting a black... Colleagues in Princeton wished to extend Blackwell's appointment at the institute. However, the president of Princeton organized a great protestation... When it was time to leave the institute, Blackwell knew no white schools would hire him, and he applied to all 105 Black schools in the country. After instructorships at Southern University and Clark College, Dr. Blackwell joined the faculty of Howard University from 1944 as an instructor... In three years, Blackwell had risen to the rank of Full Professor and Chairman.
Blackwell eventually moved to Berkeley in 1954 (after having previously been denied a position there due to "racial objections"). He became the first black professor to be tenured there, chaired the department of statistics from 1957 to 1961, and remained at the University until his retirement in 1988.
It takes a particular kind of strength to manage such a productive research career while tolerating the stresses and strains of personal insult, and carrying the aspirations of so many on one's shoulders. Blackwell was more than a brilliant mathematician; he was also a human being of extraordinary personal fortitude.
---

I am currently in Bogotá co-teaching a course with Glenn Loury at the (very impressive) Universidad de Los Andes. I am grateful to Glenn for bringing to my attention the news that Blackwell had recently passed away.

---

Update (7/19). Jeff Ely has linked to two other posts in appreciation of Blackwell: Eran Shmaya focuses on his work while Jesús Fernández-Villaverde writes (in Spanish) about his life. Both are well worth reading. Here's an extract from Eran's wonderful post:
We game theorists know Blackwell for several seminal contributions. Blackwell’s approachability theorem is at the heart of Aumann and Maschler’s result about repeated games with incomplete information... Blackwell’s theory of comparison of experiments has been influential in the game-theoretic study of value of information... Another seminal contribution of Blackwell, together with Lester Dubins, is the theorem about merging of opinions, which is the major tool in the Ehuds’ theory of Bayesian learning in repeated games. And then there are his contributions to the theory of infinite games with Borel payoffs (now known as Blackwell games) and Blackwell and Ferguson’s solution to the Big Match game.

One conspicuous aspect of many of Blackwell’s awesome papers is that they are extremely short — often a couple of pages long. He had an amazing ability to prove theorems in the right way, and he wrote with eloquence and clarity. He is the only writer I know who uses the 'as the reader can verify' trick productively, exactly at those occasions when the reader will indeed find it easier to convince himself in the validity of an assertion than to read a formal proof of it. It is very rare that I succeed in reading proofs in papers that were written dozens of years ago: Notations and perspectives change, and important results are usually reproduced in clearer way over the years. But Blackwell’s papers are still the best place to read the proofs of his theorems...

I hope I am not forcing my own agenda on Blackwell’s research when I say that for him game and decision theory were a tool to study conceptual questions about the meaning of probability and information. At any rate, he was clearly interested in these questions... I hope the game theory society will find a way to celebrate Blackwell’s contribution to our community.
Kevin Bryan has also put up a nice post on Blackwell.

---

Update (7/20). Stergios (in a comment on this post) observes that "Blackwell also made fundamental contributions to the theory of stochastic processes" and that his "renewal theorem is taught in any doctoral level course on stochastic models." And Glenn Loury has emailed me a link to a paper by Jacques Crémer "nicely expositing one of Blackwell's more influential results in statistical decision theory."

Andrew Gelman (and his commenters) have more. And Anandaswarup Gadde links to a wonderful profile from about a year ago:
Doob’s foundational work would help broaden the field of mathematics to a dizzying array of uses in science, economics and technology. So it came as no surprise when in 1942, Jerzy Neyman of the University of California at Berkeley asked if Doob were interested in going West.
“No, I cannot come, but I have some good students, and Blackwell is the best,” he replied.

“But of course he’s black,” Doob continued, “and in spite of the fact that we are in a war that’s advancing the cause of democracy, it may not have spread throughout our own land.”

The quote, repeated in the book “Mathematical People,” says a lot about the times and even more about David H. Blackwell... who started as an Illinois undergraduate in 1935 and finished with a doctoral degree six years later, all accomplished at a time when residence halls were whites-only, and approximately 100 blacks were included in the student body of nearly 12,000.

What would be the odds of the son of a railroad worker from Centralia – whose parents did not complete high school and whose Depression-era teaching prospects were limited to segregated schools – becoming one of the top theoretical mathematicians (black or white) in the world?

Almost too hard to compute...

After earning a UI doctoral degree in mathematics in 1941 at the age of 22, Blackwell completed a year at the Institute for Advanced Study in Princeton, N.J., where he worked with, among others, John von Neumann, father of modern game theory.
Berkeley’s Jerzy Neyman – who had been unable to persuade Doob to join his department – wanted to offer Blackwell a position but appeared to have come up against a deal-breaker.

In an oral history interview at Berkeley, Blackwell, now 90 years old and in “fair” health, recalled what he learned years later – that the Texan wife of the department head told her husband she “was not going to have that darky in her house.”

The job offer never came.

Blackwell focused his efforts instead on realistic career aspirations for a person of color at the time. In 1942 he applied to 105 historically black colleges, received three offers and eventually landed at Howard University in Washington, D.C., in 1944, where he remained for 10 years...

Back at Berkeley, Neyman had never forgotten Blackwell and finally hired him in 1954, where he would stay for the remainder of his career.
Read the whole thing.