Monday, October 11, 2010

Glenn Loury on Peter Diamond

Glenn Loury has kindly forwarded me a letter he wrote earlier this year in appreciation of Peter Diamond, one of the co-recipients of this year's Nobel Memorial Prize in Economics. The tribute was written for the occasion of Diamond's retirement, and seems worth publishing today:
April 20, 2010
Prof. James Poterba, Chair
Department of Economics
Massachusetts Institute of Technology

Dear Jim:

It is a pleasure to contribute a brief note of tribute to Peter Diamond, on this occasion of celebration for his work as scholar and teacher.

Peter was an inspiration and role model for me during my student years at MIT. My encounters with him -- in the classroom and in his office -- left an indelible impression. I recall going over to the Dewey Library shortly after arriving in Cambridge, in the summer of 1972, and digging out Peter's doctoral dissertation. This was a mistake! Peter's reputation as a powerful theorist had been noted by my undergraduate teachers at Northwestern. I wanted to see how this reputed superstar had gotten his start. Just how good could it be, I wondered? I had no idea! What I discovered was an elegant, profound and exquisitely argued axiomatic treatment of the general problem of representing consumption preferences over an infinite time horizon, extending results obtained by his undergraduate teacher and the future Nobel Laureate, Tjalling Koopmans.

I prided myself on being a budding mathematician in those years. Yet, Peter's effortless mastery in that dissertation of the relevant techniques from topology and functional analysis, and his successful application of those methods to a problem of fundamental importance in economic theory -- all accomplished by age 23, younger than I was at the moment I held his thesis binder in my hands! -- was simply stunning. This set what seemed to me then, and still seems now, to be an unapproachable standard. I was depressed for weeks thereafter!

Even more depressing was what I discovered as I got to know Peter better over the course of my first two years in the program: that mathematical technique was not even his strongest suit! An unerring sense of what constitute the foundational theoretical questions in economic science, and a rare creative gift of being able to imagine just the right formal framework in the context of which such questions can be posed and answered with generality -- this, I came to understand, is what Peter Diamond was really good at.

And so, I learned from him in those years what turned out to be the most important lesson of my graduate educational experience -- that, in the doing of economic theory and relative to the behavioral significance of the issue under investigation, technique is always a matter of secondary importance -- neither necessary nor sufficient for the production of lasting insights. I learned this from the careful study of Peter's seminal contributions to growth theory, the theories of taxation and social insurance, the theories of choice under uncertainty and the allocation of risk-bearing, the theories of legal rules and institutions, and the theory of unemployment. I also learned this from Peter's elegant and comprehensive lectures on his own work in these areas and on that of other scholars. And so I came -- slowly and fitfully, because I was rather attached to the joys of doing mathematics for its own sake -- to see the world the way that Peter Diamond saw it. And, in the process, I became a much better economist.

Peter graciously agreed to be the second reader on my dissertation, even though I was writing outside of his areas of specialization at the time, and my intellectual indebtedness to him only increased over the course of my last two years at MIT. It has by now become rather clear that I shall never be able to discharge that debt.

So, thanks Peter, for your extraordinary generosity as a teacher, and for your unmatched example as a scholar.

Glenn C. Loury
Merton P. Stoltz Professor of the Social Sciences
Professor of Economics and of Public Policy
Brown University
The following passage from the letter is worth repeating:
And so, I learned from him in those years what turned out to be the most important lesson of my graduate educational experience -- that, in the doing of economic theory and relative to the behavioral significance of the issue under investigation, technique is always a matter of secondary importance -- neither necessary nor sufficient for the production of lasting insights.
I have had very little time for blogging recently, thanks to two new courses, but if I can find the time I'd like to write a post on Diamond's classic 1982 paper on search, and the wonderful coconut parable he used in order to illuminate the theory.

Tuesday, October 05, 2010

Hot Potatoes

RT Leuchtkafer follows up on his earlier remarks with a comment in the Financial Times:
After a detailed four-month review of the flash crash, looking at market data streams tick-by-tick and down to the millisecond, the SEC concluded that a single order in the e-mini S&P 500 futures market ignited an inferno of panic selling. It was over in about seven minutes, and $1,000bn was up in smoke.
Within hours of the SEC’s report, the CME Group, owner of the Chicago Mercantile Exchange, issued a statement to point out that the suspect e-mini order was entirely legitimate, that it came from an institutional asset manager (that is, the public), and was little more than 1 per cent of the e-mini’s daily volume and less than 9 per cent of e-mini volume during and immediately after the crash.
How did this small bit of total volume cause such a conflagration?
You do it with computers. Specifically, you do it with unregulated computers. You pay rent so your machines sit inside the exchanges, minimising travel time for your electrons. You pay licence fees so your computers eat their fill of super-fast proprietary data feeds, data containing a shocking amount of information on everyone’s orders, not just on your own.
And when your computers spot trouble, such as a larger than expected sell-off, they dump inventory and they shut down – because they can.
No one knows what a “larger than expected sell-off” might be, but on May 6 a single hedge that added just an extra 9 per cent of selling pressure was enough to cause chaos.
When that happened, the SEC’s report says, high-frequency traders “stopped providing liquidity and began to take liquidity”, starting a frenzied race for anyone willing to buy. The report likened the panic to a downward-spiralling game of “hot potato” where, as HFT firms bought beyond their risk limits, they pulled their own bids and frantically sold to anyone they could, which were often just other HFT firms, who themselves quickly reached their risk limits and tried to sell to anyone they could, and so on – into the abyss. Fratricide ruled the day. Firms then fled the market altogether, accelerating the sell-off.
Punch drunk, markets rebounded when other market participants realised what had just happened and jumped into the market to buy.
Fair enough, some might say. Markets do panic, and sometimes for no reason. But the larger HFT firms register as formal marketmakers, receiving a variety of regulatory advantages, including greater leverage. All of this extends their enormous reach and power. In the past, they fulfilled certain obligations and observed certain restraints as a quid pro quo for those advantages, a quid pro quo intended to keep them in the market when markets were under stress and to prevent them from adding to that stress. Over the past few years, however, decades-long obligations and restraints all but disappeared, while many advantages stayed.
Computing power also opened marketmaking to a field of unregistered, or informal, high-frequency marketmakers, what investor and commentator Paul Kedrosky termed the “shadow liquidity system”. Exchanges will pay you to do it, too, just as they pay formal marketmakers, and require little in return.
The result is a loose confederation of unregulated, or lightly regulated, high-frequency marketmakers. They feed on what many consider confidential order information, play hot potato in volatile markets, and then instantly change the game to hide-and-seek if even a single hedge hits an unseen and unknowable tipping point.
The only quibble I have with this analysis is that too many different classes of algorithmic trading strategies are being bundled together under the HFT banner. In particular I would like to see a distinction made between directional strategies that are based on predicted short term price movements, and arbitrage based strategies that exploit price differentials across assets and markets. Both of these can be implemented with algorithms, rely on rapid responses to incoming market data, and involve very short holding periods. But they have completely different implications for asset price volatility. It is the mix of strategies rather than the method of their implementation that is the key determinant of market stability.

---

Update: Leuchtkafer writes in to say:
I should have been clear in the piece I was talking specifically about market making strategies. 
I appreciate the clarification, and agree with his characterization of the new market makers.

Friday, October 01, 2010

RT Leuchtkafer on the Flash Crash Report

The long-awaited CFTC-SEC report on the flash crash has finally been released. I'm still working my way through it, and hope to respond in due course. In the meantime, here is an email (posted with permission) from the very interesting RT Leuchtkafer, whose thoughts on recent changes in market microstructure have been discussed at some length previously on this blog:
It's natural for any critic to focus on what he wants in the report, and I'm no different.

From the report, in the futures market: "HFTs stopped providing liquidity and instead began to take liquidity." (report pp 14-15); "...the combined selling pressure from the Sell Algorithm, HFT's and other traders drove the price of the E-Mini down..." (report p 15)

And in the equities market: "In general, however, it appears that the 17 HFT firms traded with the price trend on May 6 and, on both an absolute and net basis, removed significant buy liquidity from the public quoting markets during the downturn..." (report p 48); "Our investigation to date reveals that the largest and most erratic price moves observed on May 6 were caused by withdrawals of liquidity and the subsequent execution of trades at stub quotes." (p 79)

It's also natural - if ungraceful - for a critic to say "I told you so." OK, I'm no ballerina, and I told you so (April 16, 2010):

"When markets are in equilibrium these new participants increase available liquidity and tighten spreads. When markets face liquidity demands these new participants increase spreads and price volatility and savage investor confidence."

"...[HFT] firms are free to trade as aggressively or passively as they like or to disappear from the market altogether."

"...[HFT firms] remove liquidity by pulling their quotes and fire off marketable orders and become liquidity demanders. With no restraint on their behavior they have a significant effect on prices and volatility....they cartwheel from being liquidity suppliers to liquidity demanders as their models rebalance. This sometimes rapid rebalancing sent volatility to unprecedented highs during the financial crisis and contributed to the chaos of the last two years. By definition this kind of trading causes volatility when markets are under stress."

"Imagine a stock under stress from sellers such was the case in the fall of 2008. There is a sell imbalance unfolding over some period of time. Any HFT market making firm is being hit repeatedly and ends up long the stock and wants to readjust its position. The firm times its entrance into the market as an aggressive seller and then cancels its bid and starts selling its inventory, exacerbating the stock's decline."

"So in exchange for the short-term liquidity HFT firms provide, and provide only when they are in equilibrium (however they define it), the public pays the price of the volatility they create and the illiquidity they cause while they rebalance."

Finally, the report should put paid to the notion that HFT firms are simple liquidity providers and that they don't withdraw in volatile markets, claims that have been floating around for quite a while.

What happens next?
In a follow-up message, Leuchtkafer adds: 
I'd like to note there were many other critics who got it right, including (most importantly) Senator Kaufman, Themis Trading, David Weild, and others. They all deserve a shout out.
To this list I would add Paul Kedrosky.
Firms that began to "take liquidity" during the crash would have suffered significant losses were it not for the fact that many of their trades were subsequently broken. I have argued repeatedly that this cancellation of trades was a mistake, not simply on fairness grounds but also from the perspective of market stability:
By canceling trades, the exchanges reversed a redistribution of wealth that would have altered the composition of strategies in the trading population. I'm sure that many retail investors whose stop loss orders were executed at prices far below anticipated levels were relieved. But the preponderance of short sales among trades at the lowest prices and the fact that aberrant price behavior also occurred on the upside suggests to me that the largest beneficiaries of the cancellation were proprietary trading firms making directional bets based on rapid responses to incoming market data. The widespread cancellation of trades following the crash served as an implicit subsidy to such strategies and, from the perspective of market stability, is likely to prove counter-productive. 
The report does appear to confirm that some of the major beneficiaries of the decision to cancel trades were algorithmic trading outfits. But I need to read it more closely before offering further comment. 

Saturday, September 04, 2010

Economic Consequences of Speculative Side Bets

The following column was written jointly with Yeon-Koo Che and is crossposted from Vox EU with minor edits and links to references.
---
There is arguably no class of financial transactions that has attracted more impassioned commentary over the past couple of years than naked credit default swaps. Robert Waldmann has equated such contracts with financial arson, Wolfgang Münchau with bank robberies, and Yves Smith with casino gambling. George Soros argues that they facilitate bear raids, as does Richard Portes who wants them banned altogether, and Willem Buiter considers them to be a prime example of harmful finance. In sharp contrast, John Carney believes that any attempt to prohibit such contracts would crush credit markets, Felix Salmon thinks that they benefit distressed debtors, and Sam Jones argues that they smooth out the cost of borrowing over time, thus reducing interest rate volatility.
One reason for the continuing controversy is that arguments for and against such contracts have been expressed informally, without the benefit of a common analytical framework within which the economic consequences of their use can be carefully examined. Since naked credit default swaps necessarily have a long and a short side and the aggregate payoff nets to zero, it is not immediately apparent why their existence should have any effect at all on the availability and terms of financing or the likelihood of default. And even if such effects do exist, it is not clear what form and direction they take, or the implications they have for the allocation of a society's productive resources.
In a recent paper we have attempted to develop a framework within which such questions can be addressed, and to provide some preliminary answers. We argue that the existence of naked credit default swaps has significant effects on the terms of financing, the likelihood of default, and the size and composition of investment expenditures. And we identify three mechanisms through which these broader consequences of speculative side bets arise: collateral effects, rollover risk, and project choice.
A fundamental (and somewhat unorthodox) assumption underlying our analysis is that the heterogeneity of investor beliefs about the future revenues of a borrower is due not simply to differences in information, but also to differences in the interpretation of information. Individuals receiving the same information can come to different judgments about the meaning of the data. They can therefore agree to disagree about the likelihood of default, interpreting such disagreement as arising from different models rather than different information. As in prior work by John Geanakoplos on the leverage cycle, this allows us to speak of a range of optimism among investors, where the most optimistic do not interpret the pessimism of others as being particularly informative. We believe that this kind of disagreement is a fundamental driver of speculation in the real world.
When credit default swaps are unavailable, the investors with the most optimistic beliefs about the future revenues of a borrower are natural lenders: they are the ones who will part with their funds on terms most favorable to the borrower. The interest rate then depends on the beliefs of the threshold investor, who in turn is determined by the size of the borrowing requirement. The larger the borrowing requirement, the more pessimistic this threshold investor will be (since the size of the group of lenders has to be larger in order for the borrowing requirement to be met). Those more optimistic than this investor will lend, while the rest find other uses for their cash.
Now consider the effects of allowing for naked credit default swaps. Those who are most pessimistic about the future prospects of the borrower will be inclined to buy naked protection, while those most optimistic will be willing to sell it. However, pessimists also need to worry about counterparty risk - if the optimists write too many contracts they may be unable to meet their obligations in the event that a default does occur, an event that the pessimists consider to be likely. Hence the optimists have to support their positions with collateral, which they do by diverting funds that would have gone to borrowers in the absence of derivatives. The borrowing requirement must then be met by appealing to a different class of investors, who are neither so optimistic that they wish to sell protection, nor so pessimistic that they wish to buy it. The threshold investor is now clearly more pessimistic than in the absence of derivatives, and the terms of financing are accordingly shifted against the borrower. As a result, for any given borrowing requirement, the bond issue is larger and the price of bonds accordingly lower when investors are permitted to purchase naked credit default swaps.
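The threshold-investor logic of the last two paragraphs can be made concrete with a small numerical sketch. The calibration below is invented purely for illustration (it is not taken from the paper): risk-neutral investors, one unit of cash each, believed default probabilities spread uniformly from zero (the most optimistic) to one half (the most pessimistic), a zero risk-free rate, and zero recovery in default.

```python
# Hypothetical calibration, for illustration only.
n = 1000
beliefs = [0.5 * i / (n - 1) for i in range(n)]  # sorted, optimists first
R = 300  # borrowing requirement, in units of cash

def breakeven_rate(p):
    """Rate at which a lender with default belief p breaks even:
    (1 + r) * (1 - p) = 1."""
    return p / (1.0 - p)

# Without derivatives, the R most optimistic investors lend, and the
# threshold (marginal) lender's belief pins down the interest rate.
r_no_cds = breakeven_rate(beliefs[R - 1])

# With naked CDS, suppose the 200 most optimistic investors sell
# protection instead, tying up their cash as collateral. The loan must
# now be funded by investors 200 through 200 + R - 1, so the threshold
# lender is more pessimistic and the borrower pays a higher rate.
k = 200
r_naked_cds = breakeven_rate(beliefs[k + R - 1])

print(f"rate without CDS:    {r_no_cds:.3f}")
print(f"rate with naked CDS: {r_naked_cds:.3f}")
```

With these particular numbers the rate rises from roughly 17.6 to roughly 33.3 per cent; the direction of the shift, not its magnitude, is the point.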
This effect does not arise if credit default swaps can only be purchased by holders of the underlying security. In fact, it can be shown that allowing for only “covered” credit default swaps has much the same consequences as allowing optimists to buy debt on margin: it leads to higher bond prices, a smaller issue size for any given borrowing requirement, and a lower likelihood of eventual default. While optimists take a long position in the debt by selling such contracts, they facilitate the purchase of bonds by more pessimistic investors by absorbing much of the credit risk. In contrast with the case of naked credit default swaps, therefore, the terms of lending are shifted in favor of the borrower. The difference arises because pessimists can enter directional positions on default in one case but not the other.
While this simple model sheds some light on the manner in which the terms of financing can be affected by the availability of credit derivatives, it does not deal with one of the major objections to such contracts: the possibility of self-fulfilling bear raids. To address this issue it is necessary to allow for a mismatch between the maturity of debt and the life of the borrower. This raises the possibility that a borrower who is unable to meet contractual obligations because of a revenue shortfall can roll over the residual debt, thereby deferring payment into the future.
As many economists have previously observed, multiple self-fulfilling paths arise naturally in this setting (see, for instance, Calvo, Cole and Kehoe, and Cohen and Portes). If investors are confident that debt can be rolled over in the future they will accept lower rates of interest on current lending, which in turn implies reduced future obligations and allows the debt to be rolled over with greater ease. But if investors suspect that refinancing may not be available in certain states, they demand greater interest rates on current debt, resulting in larger future obligations and an inability to refinance if the revenue shortfall is large.
A key question then is the following: how does the availability of naked credit default swaps affect the range of borrowing requirements for which pessimistic paths (with significant rollover risk) exist? And conditional on the selection of such a path, how are the terms of borrowing affected by the presence of these credit derivatives?
For reasons that are already clear from the baseline model, we find that pessimistic paths involve more punitive terms for the borrower when naked credit default swaps are present than when they are not. More interestingly, we find that there is a range of borrowing requirements for which a pessimistic path exists if and only if such contracts are allowed. That is, there exist conditions under which fears about the ability of the borrower to repay debt can be self-fulfilling only in the presence of credit derivatives. It is in this precise sense that the possibility of self-fulfilling bear raids can be said to arise when the use of such derivatives is unrestricted.
The finding that borrowers can more easily raise funds and obtain better terms when the use of credit derivatives is restricted does not necessarily imply that such restrictions are desirable from a policy perspective. A shift in terms against borrowers will generally reduce the number of projects that are funded, but some of these ought not to have been funded in the first place. Hence the efficiency effects of a ban are ambiguous. However, such a shift in terms against borrowers can also have a more subtle effect with respect to project choice: it can tilt managerial incentives towards the selection of riskier projects with lower expected returns. This happens because a larger debt obligation makes projects with greater upside potential more attractive to the firm, as more of the downside risk is absorbed by creditors.
The central message of our work is that the existence of zero-sum side bets on default has major economic repercussions. These contracts induce investors who are optimistic about the future revenues of borrowers, and would therefore be natural purchasers of debt, to sell credit protection instead. This diverts their capital away from potential borrowers and channels it into collateral to support speculative positions. As a consequence, the marginal bond buyer is less optimistic about the borrower's prospects, and demands a higher interest rate in order to lend. This can result in an increased likelihood of default, and the emergence of self-fulfilling paths in which firms are unable to roll over their debt, even when such trajectories would not arise in the absence of credit derivatives. And it can influence the project choices of firms, leading not only to lower levels of investment overall but also in some cases to the selection of riskier ventures with lower expected returns.
James Tobin (1984) once observed that the advantages of greater “liquidity and negotiability of financial instruments” come at the cost of facilitating speculation, and that greater market completeness under such conditions could reduce the functional efficiency of the financial system, namely its ability to facilitate “the mobilization of saving for investments in physical and human capital... and the allocation of saving to their more socially productive uses.” Based on our analysis, one could make the case that naked credit default swaps are a case in point.
This conclusion, however, is subject to the caveat that there exist conditions under which the presence of such contracts can prevent the funding of inefficient projects. Furthermore, an outright ban may be infeasible in practice due to the emergence of close substitutes through financial engineering. Even so, it is important to recognize that the proliferation of speculative side bets can have significant effects on economic fundamentals such as the terms of financing, the patterns of project selection, and the incidence of corporate and sovereign default.

Saturday, August 28, 2010

Lessons from the Kocherlakota Controversy

In a speech last week the President of the Minneapolis Fed, Narayana Kocherlakota, made the following rather startling claim:
Long-run monetary neutrality is an uncontroversial, simple, but nonetheless profound proposition. In particular, it implies that if the FOMC maintains the fed funds rate at its current level of 0-25 basis points for too long, both anticipated and actual inflation have to become negative. Why? It’s simple arithmetic. Let’s say that the real rate of return on safe investments is 1 percent and we need to add an amount of anticipated inflation that will result in a fed funds rate of 0.25 percent. The only way to get that is to add a negative number—in this case, –0.75 percent.

To sum up, over the long run, a low fed funds rate must lead to consistent—but low—levels of deflation.
The proposition that a commitment by the Fed to maintain a low nominal interest rate indefinitely must lead to deflation (rather than accelerating inflation) defies common sense, economic intuition, and the monetarist models of an earlier generation. This was pointed out forcefully and in short order by Andy Harless, Nick Rowe, Robert Waldmann, Scott Sumner, Mark Thoma, Ryan Avent, Brad DeLong, Karl Smith, Paul Krugman and many other notables.

But Kocherlakota was not without his defenders. Stephen Williamson and Jesus Fernandez-Villaverde both argued that his claim was innocuous and completely consistent with modern monetary economics. And indeed it is, in the following sense: the modern theory is based on equilibrium analysis, and the only equilibrium consistent with a persistently low nominal interest rate is one in which there is a stable and low level of deflation. If one accepts the equilibrium methodology as being descriptively valid in this context, one is led quite naturally to Kocherlakota's corner.

But while Williamson and Fernandez-Villaverde interpret the consistency of Kocherlakota's claim with the modern theory as a vindication of the claim, others might be tempted to view it as an indictment of the theory. Specifically, one could argue that equilibrium analysis unsupported by a serious exploration of disequilibrium dynamics could lead to some very peculiar and misleading conclusions. I have made this point in a couple of earlier posts, but the argument is by no means original. In fact, as David Andolfatto helpfully pointed out in a comment on Williamson's blog, the same point was made very elegantly and persuasively in a 1992 paper by Peter Howitt.

Howitt's paper is concerned with the inflationary consequences of a pegged nominal interest rate, which is precisely the subject of Kocherlakota's thought experiment. He begins with an old-fashioned monetarist model in which output depends positively on expected inflation (via the expected real rate of interest), realized inflation depends on deviations of output from some "natural" level, and expectations adjust adaptively. In this setting it is immediately clear that there is a "rational expectations equilibrium with a constant, finite rate of inflation that depends positively on the nominal rate of interest" chosen by the central bank. This is the equilibrium relationship that Kocherlakota has in mind: lower interest rates correspond to lower inflation rates and a sufficiently low value for the former is associated with steady deflation.

The problem arises when one examines the stability of this equilibrium. Any attempt by the bank to shift to a lower nominal interest rate leads not to a new equilibrium with lower inflation, but to accelerating inflation instead. The remainder of Howitt's paper is dedicated to showing that this instability, which is easily seen in the simple old-fashioned model with adaptive expectations, is in fact a robust insight and holds even if one moves to a "microfounded" model with intertemporal optimization and flexible prices, and even if one allows for a broad range of learning dynamics. The only circumstance in which a lower nominal rate results in lower inflation is if individuals are assumed to be "capable of forming rational expectations ab ovo".
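The instability is easy to reproduce in a few lines. The sketch below (with parameter values chosen by us purely for illustration) implements the old-fashioned model just described: output depends on the expected real rate under the pegged nominal rate, inflation responds to the output gap, and expectations adjust adaptively.

```python
# Adaptive-expectations dynamics under a pegged nominal rate.
# All parameter values are illustrative assumptions.
i = 0.0025        # pegged nominal rate: 0.25 per cent, as in the quote
r_star = 0.01     # natural real rate: 1 per cent
c, d, lam = 1.0, 0.5, 0.5   # output, Phillips-curve and learning gains

def simulate(pi_e0, periods=60):
    """Path of realized inflation, starting from expected inflation pi_e0."""
    pi_e = pi_e0
    path = []
    for _ in range(periods):
        y = c * (pi_e - (i - r_star))    # gap: low real rate -> boom
        pi = pi_e + d * y                # expectations-augmented Phillips
        pi_e = pi_e + lam * (pi - pi_e)  # adaptive expectations
        path.append(pi)
    return path

ree = i - r_star   # the rational expectations equilibrium: steady deflation
print(f"REE inflation: {ree:.4f}")
print(f"inflation after 60 periods, starting at 0: {simulate(0.0)[-1]:.4f}")
```

Started exactly at the rational expectations value of minus 0.75 per cent, inflation stays there forever; started anywhere else, it accelerates away from the equilibrium rather than toward it, which is Wicksell's cumulative process and the heart of Howitt's stability critique.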

Howitt places this finding in historical context as follows (emphasis added):
In his 1968 presidential address to the American Economic Association, Milton Friedman argued, among other things, that controlling interest rates tightly was not a feasible monetary policy. His argument was a variation on Knut Wicksell's cumulative process. Start in full employment with no actual or expected inflation. Let the monetary authority peg the nominal interest rate below the natural rate. This will require monetary expansion, which will eventually cause inflation. When expected inflation rises in response to actual inflation, the Fisher effect will put upward pressure on the interest rate. More monetary expansion will be required to maintain the peg. This will make inflation accelerate until the policy is abandoned. Likewise, if the interest rate is pegged above the natural rate, deflation will accelerate until the policy is abandoned. Since no one knows the natural rate, the policy is doomed one way or another.

This argument, which was once quite uncontroversial, at least among monetarists, has lost its currency. One reason is that the argument invokes adaptive expectations, and there appears to be no way of reformulating it under rational expectations... in conventional rational expectations models, monetary policy can peg the nominal rate... without producing runaway inflation or deflation... Furthermore... pegging the nominal rate at a lower value will produce a lower average rate of inflation, not the ever-higher inflation predicted by Friedman...

Thus the rational expectations revolution has almost driven the cumulative process from the literature. Modern textbooks treat it as a relic of pre-rational expectations thought... contrary to these rational expectations arguments, the cumulative process is not only possible but inevitable, not just in a conventional Keynesian macro model but also in a flexible-price, micro-based, finance constraint model, whenever the interest rate is pegged... the essence of the cumulative process lies not in an economy's rational expectations equilibria but in the disequilibrium adjustment process by which people try to acquire rational expectations... under a wide set of assumptions, the process cannot converge if the monetary authority keeps interest rates pegged... the cumulative process is a manifestation of this nonconvergence. 
Thus the cumulative process should be regarded not as a relic but as an implication of real-time belief formation of the sort studied in the literature on convergence (or nonconvergence) to rational expectations equilibrium... Perhaps the most important lesson of the analysis is that the assumption of rational expectations can be misleading, even when used to analyze the consequences of a fixed monetary regime. If the regime is not conducive to expectational stability, then the consequences can be quite different from those predicted under rational expectations... in general, any rational expectations analysis of monetary policy should be supplemented with a stability analysis... to determine whether or not the rational expectations equilibrium could ever be observed. 
To this I would add only that a stability analysis is a necessary supplement to equilibrium reasoning not just in the case of monetary policy debates, but in all areas of economics. For as Richard Goodwin said a long time ago, an "equilibrium state that is unstable is of purely theoretical interest, since it is the one place the system will never remain."
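The nonconvergence at the heart of the cumulative process can be illustrated with a toy simulation. The sketch below is not Howitt's model; it is a minimal Wicksell-style cumulative process under adaptive expectations, in which the central bank pegs the nominal rate below the natural real rate. All parameter values and variable names are illustrative assumptions, not taken from the literature discussed above.

```python
# Hedged sketch (not Howitt's actual model): a Wicksell-style cumulative
# process under adaptive expectations.  The nominal rate is pegged below
# the natural real rate, so realized inflation runs above expected
# inflation; expectations chase it upward and the gap never closes.
# All parameter values are illustrative.

R_STAR = 0.03     # natural real rate of interest (assumed)
I_PEG = 0.02      # pegged nominal rate -- set too low relative to R_STAR
GAMMA = 0.5       # sensitivity of inflation to the real-rate gap (assumed)
LAM = 0.5         # adaptive-expectations adjustment speed (assumed)

pi_expected = 0.0
path = []
for _ in range(50):
    real_rate = I_PEG - pi_expected                          # Fisher relation
    pi_actual = pi_expected + GAMMA * (R_STAR - real_rate)   # demand pressure
    pi_expected += LAM * (pi_actual - pi_expected)           # adaptive update
    path.append(pi_expected)

# Expected inflation rises without bound: the peg is expectationally unstable.
print(path[0], path[-1])
```

Each round of updating raises expected inflation, which lowers the perceived real rate, which raises realized inflation further: the divergence is self-reinforcing, which is exactly why no stationary rational expectations equilibrium can be learned under the peg.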

---

Update (8/29). From a comment by Robert Waldmann:
I think that it is important that in monetary models there are typically two equilibria -- a monetary equilibrium and a non-monetary equilibrium.

The assumption that the economy will end up in a rational expectations equilibrium does not imply that a low nominal interest rate leads to an equilibrium with deflation. It might lead to an equilibrium in which dollars are worthless.

I'd say the experiment has been performed. From 1918 through (most of) 1923 the Reichsbank kept the discount rate low (3.5% IIRC) and met demand for money at that rate.

The result was not deflation. By October 1923 the Reichsmark was no longer used as a medium of exchange.
In fact, the only stable steady state under a nominal interest rate peg in the Howitt model is the non-monetary one.

Thursday, August 19, 2010

On Broken Trades and Bailouts

Back in 1980, Avraham Beja and Barry Goldman published a theoretical paper in the Journal of Finance that explored the manner in which the composition of trading strategies in an asset market affects the volatility of prices. Their main insight was that if the prevalence of momentum based strategies was too large relative to that of strategies based on fundamental analysis, then the dynamics of asset prices would be locally unstable: departures of prices from fundamentals would be amplified rather than corrected over time. More importantly, they argued that the relationship between the composition of strategies and market stability was discontinuous: there was a threshold (bifurcation) value of this population mixture that separated the stable from the unstable regime, and an imperceptible change in composition that took the market across the threshold could result in dramatic increases in volatility.

The Beja/Goldman analysis can be taken a step further: not only does market stability depend on the composition of trading strategies, but the profitability of different trading strategies, and hence changes in their relative population shares over time, depend very much on whether one is in a stable or an unstable regime. In a stable regime prices track fundamentals reasonably well, which makes it possible for technical strategies to extract information from incoming market data without going through the trouble and expense of fundamental research. Such strategies can therefore prosper and proliferate, provided that they remain sufficiently rare. But if they become too common, markets are destabilized, asset price bubbles can form, and the value of fundamental information rises. When a major correction arrives, it is the fundamental strategies that prosper, the composition of trading strategies is shifted accordingly, and market stability is restored for a time. This process of endogenous regime switching provides one possible interpretation of the empirical phenomenon known as volatility clustering.
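The threshold behavior described above can be illustrated with a toy linear model in the spirit of Beja and Goldman (not their actual specification): fundamentalists push the price toward an assumed fundamental value while chartists extrapolate the most recent price change. The adjustment speed, population shares, and starting shock are all hypothetical.

```python
# Illustrative sketch, not the Beja-Goldman model itself: a linear price
# dynamic with two strategy types.  Fundamentalists buy when price is below
# value v; chartists buy when the price has just risen.  Parameters are
# hypothetical and chosen only to exhibit the stability threshold.

def simulate(w_chartist, steps=200, a=2.0, v=100.0):
    """Return |p - v| after `steps` periods, starting from a unit shock.

    w_chartist : population share of trend-followers (chartists);
                 the remainder are fundamentalists.
    a          : overall speed of price adjustment to excess demand.
    """
    w_fund = 1.0 - w_chartist
    p_prev, p = v + 1.0, v + 1.0          # unit deviation from fundamentals
    for _ in range(steps):
        excess_demand = w_fund * (v - p) + w_chartist * (p - p_prev)
        p_prev, p = p, p + a * excess_demand
    return abs(p - v)

# A small chartist share damps the initial shock; a large one amplifies it.
print(simulate(w_chartist=0.2))   # deviation shrinks toward zero
print(simulate(w_chartist=0.9))   # deviation grows explosively
```

In this linearized sketch the transition between the two outcomes is abrupt: as the chartist share crosses a critical value, the dominant eigenvalue of the price dynamic crosses the unit circle, which is the discontinuity in stability that Beja and Goldman emphasized.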

From this perspective, it is critically important that technical trading strategies be allowed to suffer losses when market instability arises. The cancellation of trades in almost 300 securities after the flash crash of May 6 did exactly the opposite, by providing an implicit subsidy to destabilizing strategies. The excuse that this was done to protect retail investors whose stop orders were executed as prices fell to insane levels is unconvincing. According to the SEC's own report on the crash, most trades against stub quotes of five cents or less were short sales, and there was also considerable upward instability, with prices rising well beyond the reach of ordinary retail investors. (Shares in Sotheby's, for instance, changed hands at ten million dollars per round lot.) The cancellation of trades was therefore a bailout of some funds (heavily reliant on algorithmic trading) at the expense of others, and this prevented a stabilizing shift in the market composition of trading strategies.

A similar argument could be made about the effects of the Troubled Asset Relief Program. It has recently been claimed, for instance by Alan Blinder and Mark Zandi, that TARP has been a "substantial success" because it averted a second Great Depression at a cost to taxpayers that is turning out to be much lower than originally feared:
The Troubled Asset Relief Program was controversial from its inception. Both the program’s $700 billion headline price tag and its goal of “bailing out” financial institutions—including some of the same institutions that triggered the panic in the first place—were hard for citizens and legislators to swallow. To this day, many believe the TARP was a costly failure. In fact, TARP has been a substantial success, helping to restore stability to the financial system and to end the freefall in  housing and auto markets. Its ultimate cost to taxpayers will be a small fraction of the headline $700 billion figure: A number below $100 billion seems more likely to us, with the bank bailout component probably turning a profit.
Yves Smith is unpersuaded by such figures, which she attributes to "back door, less visible bailouts, super cheap interest rates, [and] regulatory forbearance." But even if one were to take at face value the Blinder-Zandi estimates of the revenue consequences of TARP, there remain potentially harmful effects on the size distribution of firms and the distribution of financial practices. The institutions that were bailed out made directional bets that either failed outright, or were placed with counterparties that would have failed in the absence of government support. Smaller institutions making such mistakes were allowed to go under, while larger ones were bailed out. Quite apart from the unfairness of this, the policy could be severely damaging to the stability of the system over the medium run.

This point was made a couple of months ago in a speech by Richard Fisher of the Dallas Fed (and expanded upon by Tyler Durden and Ashwin Parameswaran shortly thereafter):
Big banks that took on high risks and generated unsustainable losses received a public benefit... As a result, more conservative banks were denied the market share that would have been theirs if mismanaged big banks had been allowed to go out of business. In essence, conservative banks faced publicly backed competition...
The system has become slanted not only toward bigness but also high risk... Clearly, if the central bank and regulators view any losses to big bank creditors as systemically disruptive, big bank debt will effectively reign on high in the capital structure. Big banks would love leverage even more, making regulatory attempts to mandate lower leverage in boom times all the more difficult. In this manner, high risk taking by big banks has been rewarded, and conservatism at smaller institutions has been penalized...

It is not difficult to see where this dynamic leads—to more pronounced financial cycles and repeated crises.
Fisher goes on to argue for strict limits on the size of individual financial institutions relative to that of the industry. So does Nouriel Roubini:
Greed has to be controlled by fear of loss, which derives from knowledge that the reckless institutions and agents will not be bailed out. The systematic bailouts of the latest crisis – however necessary to avoid a global meltdown – worsened this moral-hazard problem. Not only were “too big to fail” financial institutions bailed out, but the distortion has become worse as these institutions have become – via financial-sector consolidation – even bigger. If an institution is too big to fail, it is too big and should be broken up.
But were the bailouts really necessary to avoid a global meltdown? Blinder and Zandi argue that the alternative would have been completely catastrophic:
The financial policy responses were especially important. In the scenario without them, but including the fiscal stimulus, the recession would only now be winding down, a full year after the downturn’s actual end... The differences between the baseline and the scenario based on no financial policy responses... represent our estimates of the combined effects of the various policy efforts to stabilize the financial system — and they are very large. By 2011, real GDP is almost $800 billion (6%) higher because of the policies, and the unemployment rate is almost 3 percentage points lower. By the second quarter of 2011 — when the difference between the baseline and this scenario is at its largest — the financial-rescue policies are credited with saving almost 5 million jobs.
Here the baseline is the set of policies actually pursued (including fiscal and financial policies) and it is being compared to the case of "no financial policy responses." However, as Yves Smith and Barry Ritholtz have pointed out, this is an absurd counterfactual. Barry argues that  the proper point of comparison ought to be what should have been done, which in his view is the following:
One by one, we should have put each insolvent bank into receivership, cleaned up the balance [sheet], sold off the bad debts for 15-50 cents on the dollar, fired the management, wiped out the shareholders, and spun out the proceeds, with the bondholders taking the haircut, and the taxpayers on the hook for precisely zero dollars. Citi, Bank of America, Wamu, Wachovia, Countrywide, Lehman, Merrill, Morgan, etc. all of them should have been handled this way.

The net result of this would have been more turmoil, lower stock prices, and a sharper, but much shorter economic contraction. It would have been painful and disruptive — like emergency surgery is — but its better than an exploded appendix.

And today, we would have a much healthier economy.
Whether or not one agrees with this assessment, Yves and Barry are surely correct in arguing that counterfactuals other than the hands-off policy ought to be considered before one accepts the emerging conventional wisdom that the authorities handled the crisis well.

What the broken trades of May 6 and the bailouts of 2008 have in common is that they were both impulsive decisions, designed to deal with immediate concerns, and executed with little regard for their long term consequences. As I said in an earlier post, these decisions were made under enormous pressure with little time for reflection, and mistakes made in such circumstances would ordinarily be forgivable. But to insist that the best available course of action was taken, and that any alternative would have had devastating economic costs, is neither credible nor wise.

---

Update (8/20). The comments on this post by Andy Harless, David Merkel and Economics of Contempt are worth reading. Andy thinks that I am attacking a straw man and that the Ritholtz proposal was not even feasible, let alone optimal. David questions the use by Blinder and Zandi of a forecasting model to generate counterfactuals, given the appalling performance of such models in predicting the crisis in the first place. And here's Economics of Contempt:
"Smaller institutions making such mistakes were allowed to go under, while larger ones were bailed out."

I have to take issue with that statement. Yes, large banks were bailed out, but hundreds upon hundreds of small banks were bailed out too! Fully 836 financial institutions were bailed out using TARP money, the vast majority of which were small banks. While it's true that most of the bank failures have been small banks, there were large banks that were allowed to fail too -- e.g., Lehman, WaMu.

As for Barry Ritholtz's alternative scenario, there are too many basic factual errors to take it seriously. For one thing, receivership wasn't available to non-commercial banks. It was also legally impossible to separate AIGFP from AIG, since AIG had unconditionally guaranteed all of AIGFP's liabilities, and all their trades included cross-default provisions. A lot of the actions Barry proposes were literally impossible to do. It's simply not a credible list, and I'm surprised that you would fall for it.

Finally, I think it's unfair to say that the bailouts created bad precedents without also mentioning that we now have a resolution authority for non-bank financial institutions. How are decisions that were made without the availability of a resolution authority proper precedents for decisions that will be made with a resolution authority? You would never say that decisions made in pre-FDIC bank failures are proper precedents for post-FDIC bank failures, would you?
These are all good points. I probably should have been a bit more skeptical when discussing the Ritholtz scenario. I did not intend to endorse his proposal, only to suggest that we need to think through a broad range of counterfactuals in evaluating the response to the crisis. But of course these counterfactuals must be feasible given the tools available at the time, and his point about the resolution authority is well taken.

What bothered me most about Geithner's congressional testimony was his claim that "the government’s strategy regarding AIG was essential to our success in confronting the worst financial crisis in generations." That is, in averting an economic calamity, there was no alternative to the government making massive payouts on privately negotiated speculative bets. This is a bold claim with very serious consequences and ought not to be made lightly. In particular, the consequences of alternative scenarios have to be traced out with some seriousness.

Wednesday, August 18, 2010

On Teachable Moments and Non-Conversations

The latest in the series of consistently interesting dialogues between Glenn Loury and John McWhorter has been posted:


Among the themes explored in this conversation is the manner in which a steady stream of "race-related events" are turned, in a media frenzy, into "teachable moments" and calls for a "national conversation on race." As examples they cite the Gates arrest a year ago, the recent vilification and vindication of Shirley Sherrod, Harry Reid's characterization of Obama's dialect, Hillary Clinton's comments on the King legacy, and Jim Clyburn's response to these comments -- all part of a long list of racially charged incidents that briefly occupy the national spotlight from time to time.
Glenn, for one, is terribly weary of the "melodramatic dance that we do about race and racial etiquette in this society" in the wake of such events:
I’m tired of the national non-conversation on race... we’re not talking about real things… we’re mired in a kind of superficial morality of expressive convention… what can and cannot properly be said by right thinking people… fingerpointing… grandstanding… moralizing… real genuine moral engagement with serious problems in our society… gets short shrift while everyone is posturing… checking the scorecard to see what the exactly correct way of expressing something is… I’m just so weary of this.
John agrees that the "posturing" and "witch-hunting" is little more than "a theatrical production that we are taught to pretend is an engagement with something substantial."
In contrast to the loud (if brief) responses to these so-called teachable moments, there is almost total silence in the public sphere about the really serious issues with which we need to be grappling. Here's Glenn:
One million African American men under lock and key on any given day… structured, reproduced inequality of a raced nature… violent crime perpetrated by black people often on other black people at enormous scale… children with no prospect to realize their God-given talents or their human potential because the institutions designed to facilitate their development have failed them totally… these are the things that demonstrate that society is not in a post-racial moment, and they turn out to be a lot less about theatrics and a lot more about politics, policy, candor… if we wanted to have a conversation on race we’d have to start with some of the really hard stuff, and I’m afraid it wouldn’t be as easy as hunting out politically incorrect racists and then calling them what they are.
Ta-Nehisi Coates has also recently addressed the issue of national non-conversations, arguing that we learn nothing because we aspire not to know:
I keep hearing people bantering about this notion of a national conversation on race, and I have finally figured out why it rankles so... Expecting an American conversation on race in this country, is like expecting financial advice from someone who prefers to not check their bank balance. It's not that the answers, themselves, are pre-ordained, its that we are more interested in  answers than questions, in verdicts than evidence...

Put bluntly, this is a country too ignorant of itself to grapple with race in any serious way. The very nomenclature -- "conversation on race" -- betrays the unseriousness of the thing by communicating the sense that race can be boxed from the broader American narrative, that you can somehow talk about Thomas Jefferson without Sally Hemmings; that you can discuss Andrew Jackson without discussing his betrayal of the black artillerymen who fought at the Battle of New Orleans; that you can discuss the suffrage without Sojourner Truth, Ida B. Wells or Frederick Douglass; that you can discuss temperance without understanding the support of the Klan; that you can discuss the path to statehood in Florida without discussing Fort Gadsen; that you can talk Texas without understanding cotton, and so on.

It's not so much that we don't know -- it's that we aspire to not know. The ignorance of the African-American thread in the broader American quilt -- the essential nature of that thread -- is willful... Race isn't a "distraction" from Obama's agenda -- it's the compromised, unsure ground upon which this country walks everyday...

Talk is overrated. There can be no talk with people who've conditioned themselves out of listening. This is the country we've made. This is the country we deserve.
Coates returns to this theme in a follow-up post prompted by Jim Webb's column on diversity:
I think the fact that we don't really have the implements to carry out this much ballyhoed conversation were really brought home by Jim Webb's piece "The Myth Of White Privilege"... The title, itself, is a device meant to drive conservatives to cheering, liberals to howling, and the whole of them all to page-clicking and reading, In short, it proceeds not from any desire to conversate, as we say, but to provoke strong emotion, and hopefully, page-views... I hate unthinking equivalence, but its quite clear to me that liberals and conservatives both have prominent camps that enjoy yelling.

But its still worth teasing out the intentions and the argument. The questions, themselves, are serious and worthy ones: What is "white privilege" to those who are white and poor, seemingly in perpetuity? Does Affirmative Action exist to promote diversity or historical redress? Is it both? If so, why? Who should be on the receiving end of such redress? Do immigrants from the Caribbean and Africa count? How do Native Americans fit in? What does it mean to have Affirmative Action for white women, many of whom will in turn marry white men?

How do we, specifically, define Affirmative Action? Is it any effort at diversity by anyone, anywhere? Do the questions I listed change depending on the venue? When I was hired, surely the Atlantic relished the idea of adding an African-American to their masthead. Was that Affirmative Action? If so, was it different than what happens, say, at Harvard? Was it bad?

How much does Affirmative Action actually affect white workers? How much discrimination are they actually suffering? In what spheres is this discrimination most prevalent? Are poor whites actually losing out to "people of color?" Do we have any stats on how many people have been affected by Affirmative Action? How broad is its impact?

I'm not really interested in answering any of these questions here and now, so much as I'm interested in asserting their validity, and asserting that they will always be ill-served by an 800 word op-ed with an inflammatory title. My sense is that there are answers to all of these queries. But I don't think we much care to have them. Jim Webb's piece, most regrettably, followed in the tradition of Henry Louis Gates' column on reparations, in that it is a sign post, a line of demarcation. An exclamation point, as opposed to a question mark.

The "conversation around race" is, itself, a kind of tribalism, wherein you look for ways to justify -- instead of interrogate -- your most elemental feelings.
Loury and McWhorter also discuss the Webb column at some length in their dialogue, and also consider the questions posed there to be serious and worthy. In addition, they feel that the column is indicative of a major shift in the manner of public expression regarding race. Here's Glenn again:
I think that the whole regime of genuflection at the altar of correct racial expression is on the verge of collapse… I sense in the air around me here, in the kinds of things that people are saying… Obama’s ascendancy … has contributed to that… as Obama’s success has made it easier for people to breach the etiquette of racial expression, the conversation is going to get a little rougher… people are going to let go of stuff that they’ve been holding on to for a while.
John puts it as follows: "there’s a sea change coming... we can predict, when people start letting it out… a lot of it is going to be pent up… some of it is going to come out infelicitously phrased." 
The conversation that Coates has given up on is happening already, and he is at the heart of it (as are Loury and McWhorter). But it is being drowned out by the shrillest voices, and I fear that the yelling will be ramped up for a while longer before it eventually subsides.

---

Sara Mayeux's response to the Webb column is also worth reading; she argues that the inflammatory title may well have been chosen by an "overzealous copyeditor." My own thoughts on the Gates arrest may be found here, and some reflections on an extraordinary essay by Ralph Ellison (that demonstrates brilliantly the essential nature of the "African American thread in the broader American quilt") here.

Glenn and I co-taught a course on Group Inequality at the Universidad de Los Andes in July, and were interviewed by Dinero while in Bogotá. (The interview was conducted in English, and translated for publication.)

---

Update (8/18). Maxine Udall has a characteristically thoughtful follow-up.