Sunday, November 29, 2009

Maturity Transformation and Liquidity Crises

William Dudley's keynote address at a recent CEPS symposium on the financial system is worth reading in full. What I found especially interesting were the following remarks on structural sources of instability:
The risks of liquidity crises are also exacerbated by some structural sources of instability in the financial system. Some of these sources are endemic to the nature of the financial intermediation process and banking. Others are more specific to the idiosyncratic features of our particular system. Both types deserve attention because they tend to amplify the pressures that lead to liquidity runs.

Turning first to the more inherent sources of instability, there are at least two that are worthy of mention. The first instability stems from the fact that most financial firms engage in maturity transformation — the maturity of their assets is longer than the maturity of their liabilities. The need for maturity transformation arises from the fact that the preferred habitat of borrowers tends toward longer-term maturities used to finance long-lived assets such as a house or a manufacturing plant, compared with the preferred habitat of investors, who generally have a preference to be able to access their funds quickly. Financial intermediaries act to span these preferences, earning profits by engaging in maturity transformation — borrowing shorter-term in order to finance longer-term lending.
If a firm engages in maturity transformation so that its assets mature more slowly than its liabilities, it does not have the option of simply allowing its assets to mature when funding dries up. If the liabilities cannot be rolled over, liquidity buffers will soon be weakened. Maturity transformation means that if funding is not forthcoming, the firm will have to sell assets. Although this is easy if the assets are high-quality and liquid, it is hard if the assets are lower quality. In that case, the forced asset sales are likely to lead to losses, which deplete capital and raise concerns about insolvency.
The second inherent source of instability stems from the fact that firms are typically worth much more as going concerns than in liquidation. This loss of value in liquidation helps to explain why liquidity crises can happen so suddenly. Initially, no one is worried about liquidation. The firm is well understood to be solvent... But once counterparties start to worry about liquidation, the probability distribution can shift very quickly toward the insolvency line... because the liquidation value is lower than the firm’s value as a going concern...
These sources of instability create the risk of a cascade... Once the firm’s viability is in question and it does not have access to an insured deposit funding base, the next stop is often a full-scale liquidity crisis that often cannot be stopped without massive government intervention.
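The balance-sheet arithmetic behind Dudley's first point can be made concrete with a toy example (all numbers are hypothetical, and this is a stylized sketch rather than a model of any actual institution):

```python
# Toy balance sheet (all numbers hypothetical): a firm funds a
# long-lived, illiquid asset with short-term debt that must be
# rolled over each period.
asset_book_value = 100.0            # matures far in the future
short_term_debt = 90.0              # comes due now
equity = asset_book_value - short_term_debt

# Funding shock: only half the maturing debt can be rolled over,
# and the illiquid asset can be sold only at a fire-sale discount.
rollover_fraction = 0.5
fire_sale_discount = 0.15           # sale fetches 85 cents on the dollar

funding_gap = short_term_debt * (1 - rollover_fraction)
book_value_sold = funding_gap / (1 - fire_sale_discount)
loss = book_value_sold - funding_gap
equity_after = equity - loss

print(f"forced sale of {book_value_sold:.1f} in book value, loss {loss:.1f}")
print(f"equity falls from {equity:.1f} to {equity_after:.1f}")
```

A funding shortfall of 45 forces sales of roughly 52.9 in book value; the resulting loss of about 7.9 consumes most of a capital cushion of 10, which is why doubts about liquidity shade so quickly into doubts about solvency.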
As Dudley notes, maturity transformation is "endemic to the nature of the financial intermediation process and banking." But non-financial firms (and the United States Treasury) can also engage in maturity transformation by borrowing short relative to their expected revenue streams. This is what Hyman Minsky called speculative (as opposed to hedge) financing. One of Minsky's key insights was that over a period of stable growth with relatively tranquil financial markets, there is a progressive shift away from hedge and towards speculative financing:
The natural starting place for analyzing the relation between debt and income is to take an economy with a cyclical past that is now doing well. The inherited debt reflects the history of the economy, which includes a period in the not too distant past in which the economy did not do well. Acceptable liability structures are based upon some margin of safety, so that expected cash flows, even in periods when the economy is not doing well, will cover contractual debt payments. As the period over which the economy does well lengthens, two things become evident in board rooms. Existing debts are easily validated and units that were heavily in debt prospered; it paid to lever. After the event it becomes apparent that the margins of safety built into debt structures were too great. As a result, over a period in which the economy does well, views about acceptable debt structure change. In the deal making that goes on between banks, investment bankers, and businessmen, the acceptable amount of debt to use in financing various types of activity and positions increases. (Minsky 1982, p.65)
Short-term financing of long-lived capital assets is lucrative as long as debts can be rolled over easily at relatively stable interest rates. But this induces more firms to engage in speculative rather than hedge financing, making the demand for refinancing increasingly inelastic. The eventual result is a crisis of liquidity and a shift back towards hedge financing.
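This ratchet can be caricatured in a few lines of code (a deliberately crude sketch with hypothetical parameters, not a formal model): in tranquil periods the "acceptable" debt ratio drifts upward as margins of safety come to look excessive, until refinancing fails and leverage is forced back down.

```python
# A crude caricature of Minsky's leverage dynamic (hypothetical
# parameters throughout). Tranquility breeds confidence: acceptable
# leverage drifts upward until it crosses a fragility threshold,
# refinancing fails, and there is a sharp return to hedge financing.
def simulate(periods=60, drift=0.02, threshold=0.9, reset=0.5):
    leverage, path, crises = reset, [], []
    for t in range(periods):
        if leverage > threshold:          # refinancing fails
            crises.append(t)
            leverage = reset              # forced deleveraging
        else:
            leverage *= 1 + drift         # confidence builds
        path.append(leverage)
    return path, crises

path, crises = simulate()
print("crisis periods:", crises)
```

The point of the sketch is only that stability is destabilizing here by construction: the crisis is produced by the tranquil phase itself, with no external shock required.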
Many economists (myself included) have tried to construct formal models of the process described by Minsky, but with limited success to date. This may be a good time to give it another shot. 

Thursday, November 26, 2009

On Prediction Markets for Climate Change

There's an interesting debate in progress between Nate Silver and Matt Yglesias on the merits of introducing prediction markets for climate change. Nate is enthusiastic about Robin Hanson's proposal that such markets be developed, Matt is concerned about manipulation of prices by coal and oil interests, and Nate thinks that these concerns are a bit overblown and could be overcome by creating markets that have broad participation and high levels of liquidity.
Nate's argument is roughly as follows: the broader the participation and the greater the volume of trade, the more expensive it will be for an individual or organization to consistently manipulate prices over a period of months or years. If this argument is correct, then markets with limited participation and low volume (such as the Iowa Electronic Markets) should be less efficient at aggregating information than markets with relatively broad participation and much higher volume (such as Intrade). The logic of this argument is so compelling that I was once certain it must be true. But after watching these two markets closely during the 2008 election season, I became convinced that it was IEM rather than Intrade that was sending the more reliable signals, and for some very interesting and subtle reasons.
First of all, let's think for a minute about how one might determine which of two markets is aggregating information more efficiently. We can't just look at events that occurred and examine which of the two markets assigned such events greater probability, because low probability events do indeed sometimes occur. If we had a very large number of events (as in weather forecasting) then one could construct calibration curves to compare markets, but the number of contracts on IEM is very small and this option is not available. So what do we do?
Fortunately, there is a reliable method for comparing the efficiency of the two markets: look for and exploit cross-market arbitrage opportunities. Here's how it works. Open an account in each market with the same initial investment. There is a limit of $500 on initial account balances at IEM, so let's take this as our initial investment at Intrade as well. Next, look for arbitrage opportunities: differences in prices for the same asset across markets that are large enough for you to make a certain profit, net of trading fees (these are zero on IEM but not on Intrade). Such opportunities do arise, and sometimes last for hours or even days: here's an example. Act on these opportunities by selling where the price is high and buying where it is low. When prices in the two markets converge, reverse these trades: buy where you initially sold and sell where you initially bought. You will not make much money doing this, since the price differences will generally be small. But what you will do is transfer funds across accounts without making a loss.
How does this help in answering the question of which market is more efficient? After a few weeks or months have passed, your overall balance will have grown slightly, but it will now be unevenly distributed across the two accounts. The market in which you have made more money is the less efficient one. This is because, on average, prices in the less efficient market will move towards those in the more efficient one, so when you reverse your arbitrage position, your profit will be concentrated in the market in which the price has moved most.
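The bookkeeping behind this test can be illustrated with a stylized example (the prices and quantities are hypothetical, and trading fees and contract conventions are ignored):

```python
# Stylized cross-market arbitrage bookkeeping (hypothetical prices;
# fees and contract conventions ignored). The same binary contract
# trades at different prices in two markets.
p_a, p_b = 0.60, 0.55           # market A is expensive, market B is cheap
qty = 100                       # contracts traded in each market

# Open the arbitrage: sell where the price is high, buy where it is low.
cash_a = +p_a * qty             # proceeds from selling in A
cash_b = -p_b * qty             # cost of buying in B

# Prices converge; suppose B's price moves most of the way toward A's.
p_converged = 0.59

# Reverse the trades: buy back in A, sell in B.
cash_a -= p_converged * qty
cash_b += p_converged * qty

print(f"profit in A: {cash_a:+.2f}, profit in B: {cash_b:+.2f}")
```

Here the profit of 4.00 in market B dwarfs the 1.00 in market A, because B's price did most of the moving; by the argument above, B is the less efficient market.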
Let me state for the record that I did not, in fact, carry out this experiment, although I think it would be a good (and probably publishable) research project. But I did try to see informally which market was better at predicting future prices in the other, and came to the conclusion that it was IEM. This surprised me, and I started to wonder why a small, illiquid market with severe restrictions on participation and account balances could be more efficient.
There are two possible reasons. First, Intrade was highly visible in the news media, and changes in prices were regularly reported on blogs across the political spectrum. A fall in the price of a contract could signal weakness in a campaign, generate pessimism about its viability, and result in a collapse in fundraising. Propping up the price during a difficult period therefore made a lot of sense, and could pay for itself several times over with its impact on donations. Dollar for dollar, it was probably a much better investment than television advertising in prime time. I'm not suggesting that the campaigns themselves did or encouraged this, but it does seem likely that some well-financed supporters took it upon themselves to help out in this way.
The second reason is more interesting. The extent of participation and the volume of trade in a market are not determined simply by the market design; they also depend on the availability of profit opportunities, which itself depends in part on the extent of attempted manipulation. There is an active users' forum on Intrade, and it was clear at the time that a small, smart group of traders was on the lookout for mispriced assets, well aware that such mispricing could arise out of political enthusiasm (as in the nominee contract for Ron Paul) or through active manipulation (as in the Obama and McCain contracts discussed by Nate here).
In other words, the breadth of participation and the volume of trade will be higher when market manipulation is suspected than when it is not. If the climate change futures market is assumed to be efficient, it will probably attract fewer traders and lower volumes of investment. So Nate's solution - the design of a market with high participation and liquidity in order to generate efficiency - contains at its heart a paradox. It is inefficiency that will generate high participation and liquidity should such a market come into existence.
I do believe that the introduction of prediction markets for climate change is a good idea. But I would like to see similar contracts offered across multiple markets, including at least one like the IEM in which participation is limited with respect to both membership and initial balance. This will allow us to carry out an ongoing evaluation of the reliability of market signals, as well as the effectiveness of different market designs.

---

Update (12/12): Thanks to Paul Hewitt for an extended discussion of this post, and to Chris Masse for linking both here and to Paul's commentary.

Tuesday, November 24, 2009

On Buiter, Goodwin, and Nonlinear Dynamics

A few months ago, Willem Buiter published a scathing attack on modern macroeconomics in the Financial Times. While a lot of attention has been paid to the column's sharp tone and rhetorical flourishes, it also contains some specific and quite constructive comments about economic theory that deserve a close reading. One of these has to do with the limitations of linearity assumptions in models of economic dynamics:
When you linearize a model, and shock it with additive random disturbances, an unfortunate by-product is that the resulting linearised model behaves either in a very strongly stabilising fashion or in a relentlessly explosive manner.  There is no ‘bounded instability’ in such models.  The dynamic stochastic general equilibrium (DSGE) crowd saw that the economy had not exploded without bound in the past, and concluded from this that it made sense to rule out, in the linearized model, the explosive solution trajectories.  What they were left with was something that, following an exogenous  random disturbance, would return to the deterministic steady state pretty smartly.  No L-shaped recessions.  No processes of cumulative causation and bounded but persistent decline or expansion.  Just nice V-shaped recessions.
Buiter is objecting here to a vision of the economy as a stable, self-correcting system in which fluctuations arise only in response to exogenous shocks or impulses. This has come to be called the Frisch-Slutsky approach to business cycles, and its intellectual origins date back to a memorable metaphor introduced by Knut Wicksell more than a century ago: "If you hit a wooden rocking horse with a club, the movement of the horse will be very different to that of the club" (translated and quoted in Frisch 1933). The key idea here is that irregular, erratic impulses can be transformed into fairly regular oscillations by the structure of the economy. This insight can be captured using linear models, but only if the oscillations are damped - in the absence of further shocks, there is convergence to a stable steady state. This is true no matter how large the initial impulse happens to be, because local and global stability are equivalent in linear models.

A very different approach to business cycles views fluctuations as being caused by the local instability of steady states, which leads initially to cumulative divergence away from balanced growth. Nonlinearities are then required to ensure that trajectories remain bounded. Shocks to the economy can make trajectories more erratic and unpredictable, but are not required to account for persistent fluctuations. An energetic and life-long proponent of this approach to business cycles was Richard Goodwin, who also produced one of the earliest such models in economics (Econometrica, 1951). Most of the literature in this vein has used aggregate investment functions and would not be considered properly microfounded by contemporary standards (see, for instance, Chang and Smyth 1971, Varian 1979, or Foley 1987). But endogenous bounded fluctuations can also arise in neoclassical models with overlapping generations (Benhabib and Day 1982, Grandmont 1985).

The advantage of a nonlinear approach is that it can accommodate a very broad range of phenomena. Locally stable steady states need not be globally stable, so an economy that is self-correcting in the face of small shocks may experience instability and crisis when hit by a large shock. This is Axel Leijonhufvud's corridor hypothesis, which its author has discussed in a recent column. Nonlinear models are also required to capture Hyman Minsky's financial instability hypothesis, which argues that periods of stable growth give rise to underlying behavioral changes that eventually destabilize the system. Such hypotheses cannot possibly be explored formally using linear models.
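The contrast between damped linear dynamics and bounded nonlinear fluctuations is easy to see numerically. The sketch below integrates a damped linear oscillator alongside a van der Pol oscillator, a standard textbook example of a limit cycle standing in here for the economic models discussed above (it is an illustration of the mathematical point, not a rendering of Goodwin's 1951 model):

```python
# Damped linear oscillator vs. van der Pol oscillator, integrated
# with a simple Euler scheme. The linear system's fluctuations die
# away; the nonlinear one settles onto a bounded, persistent cycle.
def simulate(accel, x0=0.1, v0=0.0, dt=0.001, steps=100_000):
    x, v = x0, v0
    path = []
    for _ in range(steps):
        x, v = x + v * dt, v + accel(x, v) * dt
        path.append(x)
    return path

# Linear: x'' = -0.3 x' - x  (damped; locally AND globally stable)
linear = simulate(lambda x, v: -0.3 * v - x)

# van der Pol: x'' = 2(1 - x^2) x' - x  (steady state locally
# unstable, but the nonlinearity keeps trajectories bounded)
nonlinear = simulate(lambda x, v: 2 * (1 - x * x) * v - x)

tail = 20_000  # examine the last fifth of each trajectory
print("linear amplitude:   ", max(abs(x) for x in linear[-tail:]))
print("nonlinear amplitude:", max(abs(x) for x in nonlinear[-tail:]))
```

The linear trajectory ends up indistinguishable from its steady state, so it can fluctuate persistently only if shocks keep arriving; the nonlinear trajectory fluctuates on its own, with an amplitude bounded away from both zero and infinity.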

This, I think, is the point that Buiter was trying to make. It is the same point made by Goodwin in his 1951 Econometrica paper, which begins as follows:
Almost without exception economists have entertained the hypothesis of linear structural relations as a basis for cycle theory. As such it is an oversimplified special case and, for this reason, is the easiest to handle, the most readily available. Yet it is not well adapted for directing attention to the basic elements in oscillations - for these we must turn to nonlinear types. With them we are enabled to analyze a much wider range of phenomena, and in a manner at once more advanced and more elementary. 
By dropping the highly restrictive assumptions of linearity we neatly escape the rather embarrassing special conclusions which follow. Thus, whether we are dealing with difference or differential equations, so long as they are linear, they either explode or die away with the consequent disappearance of the cycle or the society. One may hope to avoid this unpleasant dilemma by choosing that case (as with the frictionless pendulum) just in between. Such a way out is helpful in the classroom, but it is nothing more than a mathematical abstraction. Therefore, economists will be led, as natural scientists have been led, to seek in nonlinearities an explanation of the maintenance of oscillation. Advice to this effect, given by Professor Le Corbeiller in one of the earliest issues of this journal, has gone largely unheeded.
And sixty years later, it remains largely unheeded.

---
Update (11/27): Thanks to Mark Thoma for reposting this.
Update (11/28): Mark has an interesting follow-up post on Varian (1979).
Update (11/29): Barkley Rosser continues the conversation.

Monday, November 23, 2009

A Further Comment on the Term Structure of Interest Rates

In my last post, I raised some questions about Paul Krugman's view that the government should not be deterred from implementing job creation policies by the fear of raising long term interest rates:
What Krugman seems to be advocating is the following: if long term rates should start to rise, the Treasury should finance the deficit by issuing more short-term (and less long-term) debt, thereby flattening the yield curve and holding long term rates low. This would prevent capital losses for carry traders (although it would lower the continuing profitability of the carry trade if short rates rise).
In effect, Krugman is arguing that the Treasury should itself act like a carry trader: rolling over short term debt to finance a long-term structural deficit. But why is this not being done already? Take a look at the current Treasury yield curve... What is currently preventing the Treasury from borrowing at much more attractive short rates to finance the deficit? Is it a fear of driving up short rates? And if so, won't the same concerns be in place if long term rates start to rise?
From today's New York Times comes a partial answer:
Treasury officials now face a trifecta of headaches: a mountain of new debt, a balloon of short-term borrowings that come due in the months ahead, and interest rates that are sure to climb back to normal as soon as the Federal Reserve decides that the emergency has passed.
Even as Treasury officials are racing to lock in today’s low rates by exchanging short-term borrowings for long-term bonds, the government faces a payment shock similar to those that sent legions of overstretched homeowners into default on their mortgages.
So the Treasury is currently swapping short term obligations for long term ones. Given their reasons for doing this, I don't see that the solution proposed by Krugman - that "the government issue more short-term debt" if long term rates start to rise - is going to be feasible. On the other hand, his suggestion that the Fed buy more long term bonds may still be an option.
Update: Both Dean Baker and Brad DeLong are unhappy with the Times column I linked to above, and probably for good reason. But as long as it accurately describes the current behavior of the Treasury, my argument still stands. Further increases in deficit spending may be a good idea, but they have to be financed in some way, and it's worth thinking about the implications of different maturity dates for new issues.

Saturday, November 21, 2009

On Carry Traders and Long Term Interest Rates

Tyler Cowen thinks that this post by Paul Krugman on long term interest rates and a follow-up by Brad DeLong are critically important and "two of the best recent economics blog posts, in some time".
Krugman's post deals with the question of why some economists in the administration are concerned that further increases in deficit financing could cause long term rates to spike:
Well, what I hear is that officials don’t trust the demand for long-term government debt, because they see it as driven by a “carry trade”: financial players borrowing cheap money short-term, and using it to buy long-term bonds. They fear that the whole thing could evaporate if long-term rates start to rise, imposing capital losses on the people doing the carry trade; this could, they believe, drive rates way up, even though this possibility doesn’t seem to be priced in by the market.

What’s wrong with this picture?

First of all, what would things look like if the debt situation were perfectly OK? The answer, it seems to me, is that it would look just like what we’re seeing.

Bear in mind that the whole problem right now is that the private sector is hurting, it’s spooked, and it’s looking for safety. So it’s piling into “cash”, which really means short-term debt. (Treasury bill rates briefly went negative yesterday). Meanwhile, the public sector is sustaining demand with deficit spending, financed by long-term debt. So someone has to be bridging the gap between the short-term assets the public wants to hold and the long-term debt the government wants to issue; call it a carry trade if you like, but it’s a normal and necessary thing.

Now, you could and should be worried if this thing looked like a great bubble — if long-term rates looked unreasonably low given the fundamentals. But do they? Long rates fluctuated between 4.5 and 5 percent in the mid-2000s, when the economy was driven by an unsustainable housing boom. Now we face the prospect of a prolonged period of near-zero short-term rates — I don’t see any reason for the Fed funds rate to rise for at least a year, and probably two — which should mean substantially lower long rates even if you expect yields eventually to rise back to 2005 levels. And if we’re facing a Japanese-type lost decade, which seems all too possible, long rates are in fact still unreasonably high.

Still, what about the possibility of a squeeze, in which rising rates for whatever reason produce a vicious circle of collapsing balance sheets among the carry traders, higher rates, and so on? Well, we’ve seen enough of that sort of thing not to dismiss the possibility. But if it does happen, it’s a financial system problem — not a deficit problem. It would basically be saying not that the government is borrowing too much, but that the people conveying funds from savers, who want short-term assets, to the government, which borrows long, are undercapitalized.
And the remedy should be financial, not fiscal. Have the Fed buy more long-term debt; or let the government issue more short-term debt. Whatever you do, don’t undermine recovery by calling off jobs creation.
What Krugman seems to be advocating is the following: if long term rates should start to rise, the Treasury should finance the deficit by issuing more short-term (and less long-term) debt, thereby flattening the yield curve and holding long term rates low. This would prevent capital losses for carry traders (although it would lower the continuing profitability of the carry trade if short rates rise).
In effect, Krugman is arguing that the Treasury should itself act like a carry trader: rolling over short term debt to finance a long-term structural deficit. But why is this not being done already? Take a look at the current Treasury yield curve:
[Chart: current Treasury yield curve]
What is currently preventing the Treasury from borrowing at much more attractive short rates to finance the deficit? Is it a fear of driving up short rates? And if so, won't the same concerns be in place if long term rates start to rise?

Friday, November 20, 2009

Econometric Society Fellows: A Tale of Two Duncans

The Fellows of the Econometric Society are an elite group of economists, numbering less than 500, nominated and elected by their peers:
To be eligible for nomination as a Fellow, a person must have published original contributions to economic theory or to such statistical, mathematical, or accounting analyses as have a definite bearing on problems in economic theory... Candidates elected to Fellowship are those with a total number of check marks at least equal to 30 percent of the number of mail ballots submitted by Fellows. Over the past decade about 15 candidates per year have been elected as new Fellows.
Among the most recently elected fellows is 84-year-old R. Duncan Luce, who by most accounts should have been elected decades ago. Jeff Ely (himself a newly elected fellow) explains why it took so long:
The problem is that there are many economists and it's costly to investigate each one to see if they pass the bar. So you pick a shortlist of candidates who are contenders and you investigate those. Some pass, some don't. Now, the next problem is that there are many fellows and many non-fellows and it's hard to keep track of exactly who is in and who is out. And again it's costly to go and check every vita to find out who has not been admitted yet.
So when you pick your shortlist, you are including only economists who you think are not already fellows. Someone like Duncan Luce, who certainly should have been elected 30 years ago, most likely was elected 30 years ago, so you would never consider putting him on your shortlist.
Indeed, the simple rule of thumb you would use is to focus on young people for your shortlist.  Younger economists are more likely to be both good enough and not already fellows.
This makes sense. But the proliferation of blogs makes the costs of identifying individuals who have been unfairly overlooked much lower, because the task can be decentralized. Anyone anywhere in the world can make a case and hope that some existing fellows take notice.
In this spirit, let me make a case for Duncan Foley. While still a graduate student in the 1960s, Foley introduced what is now a standard concept of fairness into general equilibrium theory. Here's what Andrew Postlewaite wrote in 1988 about this innovation:
Nearly 20 years ago Duncan Foley introduced a notion of fairness which was completely consistent with standard economic models. This notion was that of envy, or more precisely, lack of envy. An economic outcome was said to be envy-free if no one preferred another's final bundle of goods and services to his or her own bundle. The concept is both compelling and easily accommodated in standard economic models. It is attractive on several grounds. First, it is ordinal - it does not depend upon the particular utility function representing one's preferences, and thus avoids all the problems associated with interpersonal comparison of utilities. Second, the concept relies on precisely the same economic data necessary to determine the efficiency or nonefficiency of the outcomes associated with a particular policy. After Foley introduced this concept into modern economics a number of economists, including Pazner, Schmeidler, Varian, Vind, and Yaari, analyzed and extended the concept.
It is now more than 40 years since Foley developed these ideas. For the concept of envy-free allocations alone he deserves to be elected, but this is just one of several notable contributions. His papers on the core of an economy with public goods (Econometrica 1970), equilibrium with costly marketing (Journal of Economic Theory 1970), portfolio choice and growth (American Economic Review 1970), asset management with trading uncertainty (Review of Economic Studies 1975), and asset equilibrium in macroeconomic models (Journal of Political Economy 1975) all continue to be widely cited. He has made influential contributions to Marxian economics (Journal of Economic Theory 1982) and used ideas from classical thermodynamics to understand price dispersion (Journal of Economic Theory 1994). And he has written several books, including a pioneering effort in 1971 with Miguel Sidrauski on monetary and fiscal policy in a growing economy.

It is my hope that someday soon he will be nominated and elected a new fellow of the Econometric Society, an honor he so richly deserves.

Thursday, November 19, 2009

On Rational Expectations and Equilibrium Paths

Via Mark Thoma, I recently came across this post by Paul De Grauwe:
There is a general perception today that the financial crisis came about as a result of inefficiencies in the financial markets and economic actors’ poor understanding of the nature of risks. Yet mainstream macroeconomic models, as exemplified by the dynamic stochastic general equilibrium (DSGE) models, are populated by agents who are maximising their utilities in an intertemporal framework using all available information including the structure of the model... In other words, agents in these models have incredible cognitive abilities. They are able to understand the complexities of the world, and they can figure out the probability distributions of all the shocks that can hit the economy. These are extraordinary assumptions that leave the outside world perplexed about what macroeconomists have been doing during the last decades.
De Grauwe goes on to argue that rational expectations models are "intellectual heirs of central planning" and makes a case for a "bottom-up" or agent-based approach to macroeconomics.
The rational expectations hypothesis is actually even more demanding than De Grauwe's post suggests, since it is an equilibrium assumption rather than just a behavioral hypothesis. It therefore requires not only that agents have "incredible cognitive abilities" but also that this fact is common knowledge among them, and that they are able to coordinate their behavior in order to jointly traverse an equilibrium path. This point has been made many times; for a particularly clear statement of it see the chapter by Mario Henrique Simonsen in The Economy as an Evolving Complex System.

Equilibrium analysis can be very useful in economics provided that the conclusions derived from it are robust to minor changes in specification. In order for this to be the case, it is important that equilibrium paths are stable with respect to plausible disequilibrium dynamics. As Richard Goodwin once said, an unstable equilibrium is "the one place the system will never be found." But while equilibrium dynamics are commonplace in economics now, the stability of equilibrium paths with respect to disequilibrium dynamics is seldom considered worth exploring.