Sunday, February 20, 2011

Market Ecology

The erudite and very readable RT Leuchtkafer has posted yet another comment for the Securities and Exchange Commission to digest. This one was prompted by a paper by Andrei Kirilenko, Albert Kyle, Mehrdad Samadi and Tugkan Tuzun that provides a fascinating glimpse into the kinds of trading strategies that are common in asset markets today and the manner in which they interact to determine the dynamics of asset prices.

As I have argued on a couple of earlier occasions, the stability of a market depends on the composition of trading strategies, which in turn evolves over time under pressure of differential performance. Since performance itself depends on market stability, and destabilizing strategies prosper most when they are rare, this process can give rise to switching regimes: the market alternates between periods of stability and instability, producing empirical patterns such as fat tails and clustered volatility in asset returns.
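The rare-strategy feedback at the heart of this argument can be sketched in a few lines (a purely illustrative toy: the payoff-advantage function and adaptation rate are my assumptions, not a calibrated model). The share of a destabilizing strategy grows while it is rare and stalls once it becomes common:

```python
# Stylized dynamics: x is the population share of a destabilizing strategy.
# Its payoff advantage is assumed to shrink as it becomes common (it profits
# most when rare), so the share rises when small and is pushed back when large.

def step(x, adaptation_rate=0.5):
    """One round of performance-driven switching (replicator-style update)."""
    # Assumed payoff advantage: positive when the strategy is rare,
    # negative once its share exceeds 0.3.
    advantage = 0.3 - x
    x_new = x + adaptation_rate * x * (1 - x) * advantage
    return min(max(x_new, 0.0), 1.0)

x = 0.01  # start with the destabilizing strategy almost absent
shares = [x]
for _ in range(200):
    x = step(x)
    shares.append(x)

# The share climbs while the strategy is rare and settles near the point
# where its assumed advantage vanishes (x = 0.3 under these parameters).
print(f"final share = {shares[-1]:.3f}")
```

Richer versions, in which the strategy's payoff depends on realized volatility, are what generate the alternation between stable and unstable regimes described above.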

But the underlying strategies that are at the heart of this evolutionary process are generally unobservable. Since traders have no incentive to reveal successful strategies, these can only be inferred if individual orders can be traced to specific accounts.

This is what Kirilenko and his co-authors have been able to do, on the basis of "audit-trail, transaction-level data for all regular transactions in the June 2010 E-mini S&P 500 futures contract (E-mini) during May 3-6, 2010 between 8:30 a.m. CT and 3:15 p.m. CT." While their primary concern is with the flash crash that materialized on the afternoon of the 6th, their analysis also sheds light on the composition and behavior of strategies over the period that led up to this event. Their analysis accordingly provides broader insight into the ecology of financial markets.

The authors classify accounts into six categories based on patterns exhibited in their trading behavior, such as horizon length, order size, and the willingness to accumulate significant net positions.  The categories are High Frequency Traders (HFTs), Intermediaries, Fundamental Buyers, Fundamental Sellers, Opportunistic Traders and Small Traders:
[Different] categories of traders occupy quite distinct, albeit overlapping, positions in the “ecosystem” of a liquid, fully electronic market. HFTs, while very small in number, account for a large share of total transactions and trading volume. Intermediaries leave a market footprint qualitatively similar, but smaller to that of HFTs. Opportunistic Traders at times act like Intermediaries (buying and selling around a given inventory target) and at other times act like Fundamental Traders (accumulating a directional position). Some Fundamental Traders accumulate directional positions by executing many small-size orders, while others execute a few larger-size orders. Fundamental Traders which accumulate net positions by executing just a few orders look like Small Traders, while Fundamental Traders who trade a lot resemble Opportunistic Traders. In fact, it is quite possible that in order not to be taken advantage of by the market, some Fundamental Traders deliberately pursue execution strategies that make them appear as though they are Small or Opportunistic Traders. In contrast, HFTs appear to play a very distinct role in the market and do not disguise their market activity.
Based on this taxonomy, the authors examine the manner in which the strategies vary with respect to trading volume, liquidity provision, directional exposure, and profitability. Although high-frequency traders constitute a minuscule proportion (about one-tenth of one percent) of total accounts, they are responsible for more than a third of aggregate trading volume in this market. They have extremely short trading horizons and maintain low levels of directional exposure. Under normal market conditions they are net providers of liquidity but their desire to avoid significant exposure means that they can become liquidity takers very quickly and on a large scale.

The extent to which different trading strategies provide liquidity to the market is assessed by the authors on the basis of a measure of order aggression. An order is said to be aggressive if it is marketable against a resting order in the limit order book (and is therefore executed immediately). The resting order with which it is matched is said to be passive:
From a liquidity standpoint, a passive order (either to buy or to sell) has provided visible liquidity to the market and an aggressive order has taken liquidity from the market. Aggressiveness ratio is the ratio of aggressive trade executions to total trade executions... weighted either by the number of transactions or trading volume... HFTs and Intermediaries have aggressiveness ratios of 45.68% and 41.62%, respectively. In contrast, Fundamental Buyers and Sellers have aggressiveness ratios of 64.09% and 61.13%, respectively.
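The measure itself is simple to compute. Here is a small illustration (with made-up trade records, not the paper's audit-trail data) of the aggressiveness ratio for each account category, weighted either by transaction count or by volume:

```python
from collections import defaultdict

# Hypothetical trade records: (category, aggressive?, volume in contracts).
# An execution is "aggressive" if the incoming order was marketable against
# a resting order in the limit order book; its counterparty is "passive".
trades = [
    ("HFT", True, 10), ("HFT", False, 12), ("HFT", True, 8),
    ("Fundamental Buyer", True, 50), ("Fundamental Buyer", False, 20),
]

def aggressiveness_ratios(trades):
    """Return {category: (count-weighted ratio, volume-weighted ratio)}."""
    agg_n = defaultdict(int); tot_n = defaultdict(int)
    agg_v = defaultdict(int); tot_v = defaultdict(int)
    for cat, aggressive, vol in trades:
        tot_n[cat] += 1
        tot_v[cat] += vol
        if aggressive:
            agg_n[cat] += 1
            agg_v[cat] += vol
    return {c: (agg_n[c] / tot_n[c], agg_v[c] / tot_v[c]) for c in tot_n}

print(aggressiveness_ratios(trades))
```

With these invented records, the hypothetical HFT account is aggressive on two of three executions but only 60% of volume, illustrating how the two weightings can diverge.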
This is consistent with a view that HFTs and Intermediaries generally provide liquidity while Fundamental Traders generally take liquidity. The aggressiveness ratio of High Frequency Traders, however, is higher than what a conventional definition of passive liquidity provision would predict.
Moreover, the aggressiveness ratio of HFTs is not stable over time and can spike in times of market stress as they compete for liquidity with other market participants:
During the Flash Crash, the trading behavior of HFTs, appears to have exacerbated the downward move in prices. High Frequency Traders who initially bought contracts from Fundamental Sellers, proceeded to sell contracts and compete for liquidity with Fundamental Sellers. In addition, HFTs appeared to rapidly buy and [sell] contracts from one another many times, generating a “hot potato” effect before Opportunistic or Fundamental Buyers were attracted by the rapidly falling prices to step in and take these contracts off the market.
To my mind, the most revealing findings in the paper pertain to the profitability of the various strategies, and the ability of some traders to anticipate price movements over very short horizons (emphasis added):
High Frequency Traders effectively predict and react to price changes... [they] are consistently profitable although they never accumulate a large net position. This does not change on May 6 as they appear to have been even more successful despite the market volatility observed on that day... Intermediaries appear to be relatively less profitable than HFTs. During the Flash Crash, Intermediaries also appeared to have incurred significant losses... consistent with the notion that the relatively slower Intermediaries were unable to liquidate their position immediately, and were subsequently run over by the decrease in price...

HFTs appear to trade in the same direction as the contemporaneous price and prices of the past five seconds. In other words, they buy... if the immediate prices are rising. However, after about ten seconds, they appear to reverse the direction of their trading... possibly due to their speed advantage or superior ability to predict price changes, HFTs are able to buy right as the prices are about to increase... In marked contrast... Intermediaries buy when the prices are already falling and sell when the prices are already rising...

We consider Intermediaries and HFTs to be very short term investors. They do not hold positions over long periods of time and revert to their target inventory level quickly... HFTs very quickly reduce their inventories by submitting marketable orders. They also aggressively trade when prices are about to change. Over slightly longer time horizons, however, HFTs sometimes act as providers of liquidity. In contrast... unlike HFTs, Intermediaries provide liquidity over very short horizons and rebalance their portfolios over longer horizons.
What appears to have happened during the crash is that the fastest moving market makers with the most effective algorithms for short run price prediction were able to trade ahead of their slower and less effective brethren, imposing significant losses on the latter. In Leuchtkafer's colorful language, this was a case of interdealer panic and market maker fratricide.

But regardless of how the gains or losses were distributed in this instance, the fact remains that an overwhelming share of trading activity is based on short-run price forecasts rather than fundamental research. Under these conditions, how can one expect prices to track changes in the fundamental values of the income streams to which the assets give title?

Markets have always been based on a shifting balance between information augmenting and information extracting strategies, but a computational arms race coupled with changes in institutions and regulation seem to have shifted the balance markedly towards the latter. Unless the structure of incentives is altered to favor longer holding periods, I suspect that we shall continue to see major market disruptions and spikes in volatility.

This is not just a matter of academic interest. To the extent that changes in the perceived volatility of stocks give rise to changes in asset allocations by institutional and retail investors, there will be consequences for the extent and distribution of risk-bearing, and ultimately for rates of job creation and economic growth.

---

Update (2/21). Yves Smith has generously allowed me to crosspost freely on naked capitalism, where this entry has attracted a couple of interesting comments. Here is Peripheral Visionary:
With respect to May 6... the faster algorithms may have caused the damage, but I think they also suffered from it. From the data I reviewed, the traditional market makers had huge numbers of buys at the bottom and huge numbers of sells through the recovery, and so may have come out net positive on the day, while the faster algorithms panicked when the market moved outside the range of expected behavior, and many were shut down, effectively locking in losses. In fact, I suspect that losses for HFT algorithms would have been much larger had not the exchanges canceled so many trades, with many, even most, of the sells at the bottom being algorithm trades.
This was also my initial reaction to the crash, which is why I argued against the cancellation of trades on grounds of stability. The Kirilenko paper does not really settle the question because it focuses only on the E-mini futures market where no trades were broken.

The comment by financial matters is also worth a look; this one links to a CNBC interview with Jim McCaughan in which the exit of institutional and retail investors from the market is documented.

Saturday, February 12, 2011

Belief Heterogeneity

There was an interesting conference at Columbia yesterday (though not nearly as interesting as the momentous events unfolding elsewhere at the time). The theme was "Heterogeneous Expectations and Economic Stability" and this is how the organizers (Ricardo Reis and Mike Woodford) described the goal of the meeting:
Conventional models in both macroeconomics and finance are based on the hypothesis of rational expectations, under which all agents are assumed to have common expectations, corresponding to the probabilities implied by the economist’s model. The adequacy of this familiar hypothesis has been called into question by recent events, however, notably the instability resulting from the boom and bust in real estate prices. The purpose of this conference is to bring together researchers exploring alternative approaches to modeling the dynamics of expectations, with particular attention to applications in macroeconomics and finance. We have sought to bring together proponents of a variety of approaches, who may not frequently engage one another, in the hope of reaching conclusions about which directions are most promising at this time.
And, indeed, the collection of papers presented was methodologically diverse. Although any such classification is bound to be coarse and imperfect, there seem to be four different directions in which research on expectations is proceeding. First, there is the approach of near-rational expectations, in which intertemporal optimization and Bayesian rationality are maintained but allowance is made for heterogeneous prior beliefs. Then there is the behavioral approach, which endows agents with heuristics based on regularities identified in laboratory experiments. Third, there is the evolutionary approach, which allows for a broad range of competing forecasting rules with the population composition shifting over time under pressure of performance differentials. And finally, the empirical approach, which treats expectations as a state variable to be measured using survey or market data and explained just as one would explain output or inflation. Each of these perspectives was on prominent display at the conference.

Regular readers of this blog (if there are any left, given the recent decline in my rate of posting) will know that I am deeply skeptical of the behavioral approach to trading strategies, for the simple reason that behavior in high stakes environments with strong selection pressures driving entry and exit is unlikely to be psychologically typical in the sense of reflecting outcomes of lab experiments with standard subject pools. What might be a common behavioral trait in the population at large could be extremely rare among traders, especially if such traits can be exploited with ease by other market participants. By the same token, behavior that is pathological in the lab could well become widespread in financial markets from time to time. As a result my favored approach to trading strategies in general and forecasting rules in particular is ecological.

Not surprisingly, then, the presentation I found most appealing was that of Blake LeBaron. Blake is a pioneer in the development of agent-based computational models of financial markets, and the paper he presented belonged to this class. A large number of different forecasting strategies, some based on fundamental information and others on technical data analysis, compete with each other and with a traditional buy-and-hold strategy in his model. The resulting trading dynamics give rise to asset price returns that exhibit both moderate levels of short-run momentum as well as mean reversion over longer horizons. Moreover, the long run population of forecasting rules is ecologically diverse, with both passive and active strategies well represented.

During the panel discussion at the end of the conference, Albert Marcet observed that the conference itself was symptomatic of a revolution in economic thought that is currently underway, prompted in large measure by the global financial crisis. If methodologies such as agent-based computational economics start to be published in major journals and attract attention from the most promising graduate students, then there really will be a revolution underway. But I'm not convinced that we're there yet.

One final thought. The conference organizers described the rational expectations hypothesis as one "under which all agents are assumed to have common expectations, corresponding to the probabilities implied by the economist’s model." This is an accurate characterization as far as the contemporary implementation of the hypothesis is concerned, but it is important to note that this is not the hypothesis originally advanced by John Muth in his classic paper. In fact, Muth cited survey data exhibiting "considerable cross-sectional differences of opinion" and was quite explicit in stating that his hypothesis "does not assert... that predictions of entrepreneurs are perfect or that their expectations are all the same.'' In Muth's version of rational expectations, each individual holds beliefs that are model inconsistent, although the distribution of these diverse beliefs is unbiased relative to the data generated by the actions resulting from these expectations. It is a wisdom of crowds argument, rather than one based on individual rationality.

Viewed in this manner, there is a sense in which the heterogeneous prior models (with diverse beliefs centered on a model consistent mean) represent both a departure from the rational expectations hypothesis as currently understood and a return to the original rational expectations hypothesis as formulated by Muth. The history of economic thought is full of such rather strange twists and turns.
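Muth's cross-sectional version of the hypothesis is easy to illustrate numerically (a stylized simulation with invented numbers, not Muth's survey data): individual forecasts are dispersed and individually wrong, yet the distribution of beliefs is unbiased relative to the realized outcome:

```python
import random

random.seed(0)
true_outcome = 100.0  # the value eventually generated by agents' actions

# Each agent holds a different (model-inconsistent) belief: individual
# forecasts scatter widely around the truth.
forecasts = [true_outcome + random.gauss(0, 10) for _ in range(10_000)]

mean_forecast = sum(forecasts) / len(forecasts)
dispersion = (sum((f - mean_forecast) ** 2 for f in forecasts)
              / len(forecasts)) ** 0.5

# The "considerable cross-sectional differences of opinion" are large,
# but the average forecast sits close to the realized outcome even though
# almost no individual agent is right: a wisdom-of-crowds property.
print(f"mean forecast = {mean_forecast:.2f}, cross-sectional sd = {dispersion:.2f}")
```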

Wednesday, January 26, 2011

Alison Snow Jones (fine economist)

The world of economics blogs lost a wise and original voice when Alison Snow Jones, better known to her audience as Maxine Udall (girl economist), died suddenly on Monday.

Through her blog, Maxine (as I will always remember her) gave us a glimpse of a personality that was both deeply serious about the thorny issues confronting us as a society and wonderfully playful in the manner in which she addressed them. She managed to weave personal narratives into economic arguments without ever appearing to be self-absorbed. I never met her but we exchanged occasional comments on posts and linked to each others' writing from time to time.  Sometimes one sees more at a distance than is visible at close range.

Steve Waldman has done us all a great service by collecting together in a single moving post a series of excerpts from her most thoughtful writing.

She will be missed by many.

---

Update (1/30). Will Kranz writes in (posted with permission):
Thanks for your post. I knew Alison through a family association. We ate, drank, and laughed together for close to 30 years. I know nothing about econometrics, but enjoyed her company. In addition to economics, she was interested in horses... and kayaking.
Will also sent me a link to a photograph with the following explanation:
Alison is probably taking the picture, so not in the image. This is from a trip to the Middle Fork of the Salmon in Idaho that she went on at least 10 years ago. It's typical of the sort of thing she did. Normally closer to home on the east coast, but a couple of major trips to special places.

Saturday, January 15, 2011

The Original Mandate of the Federal Reserve

Writing on his recently launched blog, my colleague Perry Mehrling traces the evolving mandate of the Federal Reserve from its founding to the present day:
From a longer historical perspective, populist targeting of the Fed, both from the right and from the left, is nothing new. Big Finance and Big Government are perennial bogeymen in American political discourse. Coupling the two in the institution of a central bank is at the heart of current debate about the role of the Fed during the crisis.

In 1913, at the founding of the Fed, legislators directly confronted both bogeymen. The whole idea of the Federal Reserve System, so the language of the Act made clear, was to channel credit preferentially to productive uses. Section 13(2) makes clear who was supposed to get the credit: “Discount of Commercial, Agricultural and Industrial Paper”, not speculative financial paper and not Treasury paper. The new Fed was about reversing the upper hand enjoyed by Big Finance, and without replacing it with the hand of Big Government.

Exigencies of war finance soon shifted the focus of the newborn Fed, and the Act was accordingly amended. During both World War I and World War II, the Fed pegged the price of Treasury debt, and expanded its balance sheet as necessary to absorb any excess supply that was not taken up by private buyers.

Does that kind of emergency intervention sound familiar? It should.

So-called QE1, back in early 2009, involved the Fed pegging the price of mortgage-backed securities by taking $1.25 trillion worth onto its own balance sheet. This is war finance. Actually it started even earlier, back in September 2008, with the collapse of Lehman and AIG. The initial balance sheet expansion occurred as, in addition to its domestic lending, the Fed lent $600 billion to foreign central banks, as well as other billions directly to foreign private banks, financing the loans simply by expanding its own monetary liabilities. This again is war finance, but without the war.

What troubles critics of the Fed is the use of the powerful tools of war finance to support private capital markets, and to support foreign bankers. For some, a similar unease arises from the latest QE2 twist, which has the Fed buying $600 billion of Treasury debt. There is no doubt in my mind that the Fed’s actions were legal under the “unusual and exigent circumstances” provision of the Act. But what everyone wants to know is whether the Fed did the right thing, and what the transformation of the Fed over the last few years portends for the future.
As it happens, one of the many responses of the Fed to the financial crisis was a return to its original mandate as an active participant in the commercial paper market: "funding purchases of commercial paper... to improve liquidity in short-term funding markets and thereby increase the availability of credit for businesses and households." But this was done in late October 2008, after a massive expansion of its balance sheet in support of failing financial intermediaries, and after TARP had been signed into law.

The main justification for these extraordinary measures in support of the financial sector was that perfectly solvent firms in the non-financial sector would have been crippled by the freezing of the commercial paper market. But as Dean Baker has consistently argued, had the Fed's intervention in the commercial paper market been more timely and vigorous, it might have been unnecessary to provide unconditional transfers to insolvent financial intermediaries. While I do not subscribe to Baker's view that Ben Bernanke "deliberately misled" Congress in order to gain approval for TARP, his main point still stands: if the Fed can increase credit availability to non-financial businesses and households by direct purchases of commercial paper, then why is any financial institution too big to fail?

It's a question that the most ardent defenders of the bailouts would do well to address. The impressive numerical estimates of the effects of these policies on output and employment rely on a comparison with a "scenario based on no financial policy responses." But this is obviously not the proper benchmark. If output and employment could have been stabilized by direct support of the non-financial sector, then we would currently be faced with a different distribution of claims to this output, as well as a different distribution of financial practices.

Among supporters of the government's financial market policies, Bill Dudley has been especially forthright in acknowledging their flaws:
[It] is deeply offensive to Americans, including me, and runs counter to basic notions of justice and fairness, that some of the very same individuals and financial firms that precipitated this crisis have also benefited so directly from the response to the crisis. This has occurred at the same time that many Americans have lost their jobs and hard-earned savings. The public outrage this situation has produced is understandable. In the context of actions taken to support the financial system, the Federal Reserve and other government agencies have provided considerable support to banking organizations and other large systemically important financial institutions. The employees and executives of those institutions have benefited from our intervention. In a perfect world we would be able to prevent those individuals and institutions from benefiting; we would have a better way to penalize those who acted recklessly. But once the crisis was underway, one goal took precedence: keeping the financial system from collapsing in order to protect the nation from an even deeper and more protracted downturn that would have been more damaging to everyone.
In a perfect world, according to Dudley, we could have done much better. But even in our very imperfect world, might we not have been able to stabilize output and employment by returning quickly and forcefully to the original mandate of the Federal Reserve, to channel credit preferentially to productive uses?

Thursday, December 30, 2010

Should Old Acquaintance Be Forgot

Although I started this blog more than eight years ago, it lay largely dormant for most of this period and this has been my first full calendar year of (somewhat) regular posting. The experience has been consistently rewarding but occasionally exhausting. As the year draws to a close I'd like to acknowledge my debt to a few of the individuals whose writing I have enjoyed and learned from over the past twelve months, and to reflect upon some of the main ideas that have been explored in these pages.

Macroeconomic Resilience began the year as an anonymous blog but was subsequently revealed to be the creation of Ashwin Parameswaran, whose ecological perspective on behavior and markets is very close to my own. Every post of his is worth reading in full, but there is one on the trade-off between resilience and stability that remains an absolute favorite of mine.

Steve Randy Waldman's posts on interfluidity are generally so compelling and self-contained that there is usually very little left to add. I have been especially appreciative of a sequence of recent posts in which he argues that technocratic arguments, regardless of their merits, are unlikely to be persuasive if they are not consonant with our moral intuitions. It is the neglect of this important point that has so many commentators wondering why a policy that allegedly saved the financial system from collapse at negligible cost to the taxpayer is so deeply unpopular.

Along similar lines, Yves Smith on naked capitalism has been relentless in her criticism of TARP (and the unseemly self-congratulation of its architects) on the grounds that superior alternatives were available at the time. While there is plenty of room for debate on these points, it's a conversation that must be had, and one that has to consider the impact of the policy on the distribution of financial practices, as well as the outrage generated when moral intuitions are offended. It is essential that Yves (and her guests) continue to challenge the emerging academic consensus on the policy. 

One of the defining events of the year for me was the flash crash of May 6. Contrary to initial media reports, this was not the result of a fat finger or computer glitch -- it was the consequence of interacting trading strategies, most of which involved algorithmically implemented rapid responses to incoming market data for very short holding periods. In understanding the mechanics of the crash I benefited from comments posted by RT Leuchtkafer in response to an SEC concept release. One of these was published three weeks before the crash and turned out to be remarkably prescient. 

Viewed in isolation, the crash might be considered fairly inconsequential, and a recurrence could probably be prevented by implementing rule changes such as trading halts followed by call auctions. But the crash ought not to be viewed in isolation. Like the proverbial canary in a coal mine, its importance lies in what it reveals about the manner in which trading strategies interact to produce major departures of prices from fundamentals from time to time. These more routine departures take longer to build and correct, are difficult to identify in real time, and leave their mark in the form of value and momentum effects, volatility clustering, and the fat tails of return distributions.

This view of speculative asset markets as a behavioral ecosystem in which the composition of strategies is a key determinant of market stability has also been advanced by David Merkel on The Aleph Blog. David's sequence of posts on what he calls "the rules" is well worth reading, and it was in response to his tenth rule that I wrote my first post on trading strategies and market efficiency. That was just a couple of weeks before the flash crash occurred and brought these ideas suddenly to life.

I am convinced that the non-fundamental volatility induced by the trading process has major effects on portfolio choice, risk-bearing, capital allocation, job creation and economic growth. Some possible mechanisms through which such effects can arise have been explored by David Weild and Edward Kim, and I thank David for bringing this work to my attention. I am also grateful to Terry Flanagan of Markets Media Magazine for an invitation to attend their Global Markets Summit where I witnessed a fascinating and combative debate on the broader economic effects of exchange-traded funds. 

On the issue of market efficiency I have tangled with Scott Sumner on multiple occasions. But his anniversary post on The Money Illusion really struck a chord with me. Scott has a talent for making complex ideas intelligible, and an ability to maintain a clear distinction between a model and the empirical phenomenon that it is designed to explain. His vision of the economy is coherent and he is a formidable intellectual adversary. His post made me even more optimistic about the ability of blogs to shape economic discourse in constructive ways.

My window to the world of economics and finance blogs is Economist's View. Mark Thoma somehow manages to be both comprehensive and highly selective in his choice of links, virtually all of which are worth following. But more importantly, his site is a wonderful clearinghouse for open debate on economic methodology, especially in relation to macroeconomics. His post on the dynamics of learning (featuring a video presentation by George Evans) was especially memorable, as was Brad DeLong's diagrammatic discussion of the topic.

Despite the recent flowering of behavioral and experimental economics, I believe that the level of methodological homogeneity in our profession is stifling. But the time may finally be ripe for the introduction of agent-based computational models into mainstream discourse. A problem with simulation-based approaches is that there are no commonly accepted criteria on the basis of which the robustness of any given set of results may be evaluated. This will change once there is an outstanding article in a leading journal that sets a standard that others can then adopt. Where will it come from? Based on my reading of ongoing work by Geanakoplos and Farmer, I suspect that it may emerge from this recently funded initiative at the Santa Fe Institute. That would be nice to see.

Although my posts here have dealt largely with economics and finance, I also have a deep personal interest in social identity and group inequality, especially in the American context. On this set of issues I have found no voice more incisive than that of Ta-Nehisi Coates, whose freshness of perspective and formidable powers of expression I find breathtaking. His post on Robert E. Lee was one of several spectacular pieces this year, and prompted me to respond with my own thoughts on cultural ancestry. Related themes have been explored in a series of fascinating dialogues between Glenn Loury and John McWhorter.

Finally, I am thankful for the numerous extraordinary comments that have been left here, many by individuals who manage superb blogs of their own. Joao Farinha on economic development, Barkley Rosser on bubbles and agent-based models, Kid Dynamite on the flash crash, Economics of Contempt on TARP, Nick Rowe on learning, Adam P on equilibrium, Andrew Gelman on dynamic graphs, 123 on exchange traded funds, Andrew Oh-Willeke on private equity and cultural founder effects, and JKH on maturity diversification come immediately to mind, but there are many, many others.

I could go on, in a futile attempt to acknowledge all those who have influenced me and taken the time and trouble to  respond either in comments here or on their own blogs. But this post has to end before the calendar year does, and this seems as good a time to stop as any.

A very Happy New Year to you all.

Saturday, December 11, 2010

Perspectives on Exchange-Traded Funds

Are exchange-traded funds good or bad for the market?

That was the title of a lively and interesting session at Markets Media's third annual Global Markets Summit last Thursday. The session was organized as an old-fashioned debate between two teams. On one side were David Weild and Harold Bradley (joined later by Robert Litan on video), who argued that heavily traded funds composed of relatively illiquid small-cap stocks were responsible, in part, for the sharp decline in initial public offerings over the past decade, with devastating consequences for capital formation and job creation.

Responding to these claims were Bruce Lavine, Adam Patti and Robert Holderith, all representing major sponsors of funds (WisdomTree, IndexIQ and EGShares respectively). The sponsors argued that they are marketing a product that is vastly superior to the traditional open-end fund, provides investors with significant liquidity, transparency and tax advantages, and is rapidly gaining market share precisely because of these benefits. From their perspective, it makes as little sense to blame exchange-traded funds for declining initial public offerings and the sluggish rate of job creation as it does to blame them for hurricanes or influenza epidemics.

So who is right?

Bradley and Litan have previously argued their position in a lengthy and data-filled report, and Weild has testified on the issue before the joint CFTC-SEC committee on emerging regulatory issues. Their argument, in a nutshell, is this: The prices of thinly traded stocks can become much more volatile when they are included in a heavily traded fund, as a consequence of the creation and redemption mechanism. For instance, a rise in the price of shares in the fund relative to net asset value induces authorized participants to create new shares while simultaneously buying all underlying securities, regardless of the relation between their current prices and any assessment of fundamental value. Similarly, a fall in the fund price relative to net asset value can trigger simultaneous sales of a broad range of securities, resulting in significant price declines for relatively illiquid stocks. This process results not only in greater volatility but also in a sharply increased correlation of returns on individual stocks. The scope for risk-reduction through diversification is accordingly reduced, which in turn influences the asset allocation decisions of long term investors. The result is a reduction in the flow of capital to the smaller, more innovative segments of the market, with predictably dire consequences for job creation.
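To make the mechanism concrete, here is a minimal sketch of the authorized participant's decision rule. The basket, prices, and premium threshold are invented purely for illustration and do not correspond to any actual fund; the point is simply that the arbitrage trades every underlying stock at once, without reference to any stock's fundamentals.

```python
# Hypothetical sketch of the ETF creation/redemption arbitrage.
# All prices, basket weights, and the premium threshold are illustrative.

def net_asset_value(prices, basket):
    """NAV of one creation unit: underlying prices times basket quantities."""
    return sum(prices[s] * n for s, n in basket.items())

def arbitrage_action(fund_price, prices, basket, threshold=0.002):
    """An authorized participant creates units when the fund trades at a
    premium to NAV (buying every underlying stock regardless of its own
    fundamentals) and redeems when it trades at a discount (selling every
    underlying stock)."""
    premium = fund_price / net_asset_value(prices, basket) - 1
    if premium > threshold:
        return "create"   # buy all underlying stocks, deliver basket, sell fund shares
    if premium < -threshold:
        return "redeem"   # buy fund shares, redeem for basket, sell all underlying
    return "hold"

basket = {"AAA": 3, "BBB": 2}         # shares of each stock per creation unit
prices = {"AAA": 10.0, "BBB": 5.0}    # NAV per unit = 40.0
print(arbitrage_action(40.5, prices, basket))  # fund at a premium  -> create
print(arbitrage_action(39.5, prices, basket))  # fund at a discount -> redeem
```

Note that the rule buys or sells the entire basket in one direction, which is exactly why heavy fund trading can raise both the volatility and the cross-correlation of the underlying stocks.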

The sponsors do not deny the possibility of these effects, but argue that any mispricing in the markets for individual stocks represents a profit opportunity for alert fundamental traders, and that this should prevent prolonged or major departures of prices from fundamentals. But this is too sanguine an assessment. Fundamental research is costly and its profitability depends not only on the scale of mispricing that is uncovered but also on the size of the positions that can be taken in order to profit from it. Furthermore, since a significant proportion of trades are driven by the arbitrage activities of authorized participants, mispricing need not be quickly or reliably corrected. Both illiquidity and high volatility serve as a deterrent to fundamental research in such markets.

The problem, in other words, is real. But what I find puzzling about Bradley's position on this issue is that he seems unable (or unwilling) to recognize that precisely the same effects can be generated by high-frequency trading. As was apparent in an earlier session at the conference, he remains among the most vocal and fervent defenders of the new market makers. His justification for this is that spreads have declined dramatically, lowering the costs of trading for all market participants, including long term investors.

There is no doubt that the costs of trading are a fraction of what they used to be, but a single-minded focus on spreads misses the big picture. It is worth recalling John Bogle's wise words:
It is the iron law of the markets, the undefiable rules of arithmetic: Gross return in the market, less the costs of financial intermediation, equals the net return actually delivered to market participants.  
If spreads and costs per trade decline, but holding periods shrink to such a degree that overall trading expenditures rise (due to significantly increased volume), the net return to long term investors as a group must fall. Furthermore, if increases in volatility and correlation induce shifts in asset allocation that have the effect of reducing financing for small companies with high growth potential, then even gross returns could decline.
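The arithmetic here is easy to verify with hypothetical numbers of my own choosing: halve the cost per trade, quadruple turnover, and aggregate intermediation costs double, so the net return delivered to investors as a group falls.

```python
# Bogle's identity with made-up numbers: net return equals gross return
# minus the aggregate cost of intermediation. Cheaper trades plus much
# higher turnover can still mean higher total costs.

gross_return = 0.08   # assumed gross market return

# Before: wider spreads, lower turnover.
cost_per_trade_before = 0.0010   # 10 bps of traded value
turnover_before = 1.0            # each dollar traded once per year

# After: spreads halved, turnover quadrupled.
cost_per_trade_after = 0.0005    # 5 bps of traded value
turnover_after = 4.0

cost_before = cost_per_trade_before * turnover_before   # 0.10% of assets
cost_after = cost_per_trade_after * turnover_after      # 0.20% of assets

print(f"net return before: {gross_return - cost_before:.4%}")  # 7.9000%
print(f"net return after:  {gross_return - cost_after:.4%}")   # 7.8000%
```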

I have been arguing for a while now that the stability of an asset market depends on the composition of trading strategies and, in particular, that one needs a large enough share of information trading to ensure that prices track fundamentals reasonably well. But changes in technology and regulation have allowed technical strategies to proliferate, and high frequency trading is a significant part of this phenomenon. The predictable result is a secular increase in asset price volatility and an increased frequency of bubbles and crashes.
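A toy simulation conveys the intuition. This is my own illustrative sketch, not a model from the literature, and every parameter is invented: fundamentalists push the price toward an (assumed constant) fundamental value, trend followers chase recent price changes, and measured volatility rises as the trend-following share grows.

```python
import random

# Toy price dynamics (invented parameters, not a calibrated model):
# fundamentalists trade against deviations from value, trend followers
# amplify recent price moves.

def simulate_volatility(chartist_share, steps=5000, seed=0):
    rng = random.Random(seed)
    value, price, prev_price = 100.0, 100.0, 100.0
    returns = []
    for _ in range(steps):
        fundamental_demand = 0.1 * (value - price)    # mean reversion
        chartist_demand = 0.5 * (price - prev_price)  # trend chasing
        net_demand = ((1 - chartist_share) * fundamental_demand
                      + chartist_share * chartist_demand)
        prev_price, price = price, price + net_demand + rng.gauss(0, 0.5)
        returns.append(price - prev_price)
    mean = sum(returns) / len(returns)
    return (sum((r - mean) ** 2 for r in returns) / len(returns)) ** 0.5

for share in (0.0, 0.5, 0.9):
    print(f"trend-follower share {share:.0%}: "
          f"volatility {simulate_volatility(share):.3f}")
```

In this sketch volatility increases monotonically with the trend-following share; push the chartist demand coefficient high enough and the process becomes explosive, which is the simulation-world analogue of a bubble and crash.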

The flash crash of May 6 was just a symptom of this. Viewed in isolation, it was a minor event: prices fell (or rose, in some cases) to patently absurd levels, then snapped back within a matter of minutes. But the crash was the canary in the proverbial coal mine -- it was important precisely because it made visible what is ordinarily concealed from view. Departures of prices from fundamentals are routine events that, especially on the upside, are not quickly corrected. Some of the proposed responses to the crash that were favored at the conference -- such as trading halts followed by call auctions -- are cosmetic changes. They will have the effect of silencing the canary while doing nothing to lower toxicity in the mine.

It is the unremarkable, invisible, gradually accumulating departures of prices from fundamentals that are the real problem. These show up in the magnitude and clustering of asset price volatility and, through their effects on the composition of portfolios, leave their mark on the path of capital allocation, employment, and economic growth.

---

I am grateful to Terry Flanagan of Markets Media Magazine for the invitation to attend the summit.

I would also like to mention that the Kauffman report contains a number of assertions with which I disagree. For instance, Bradley and Litan endorse the claims of Bogan, Connor and Bogan that an exchange-traded fund with significant short interest could collapse with some investors unable to redeem their shares. This has been refuted very effectively by Steve Waldman in his comments on the Bogan post, and by Kid Dynamite. It is unfortunate that most responses to the report have focused on this dubious claim, rather than the more legitimate arguments that are advanced there.

---

Update (12/11). David Weild writes in to say:
I think we are seeing capital leave the microcap markets for a variety of reasons including:
  • Loss of liquidity providers
  • Emergence of ETFs (they don't buy IPOs and most don't buy follow-on offerings)
  • Indexing displacing fundamental investing (again, when this occurs, the funds stop investing in IPOs)
  • Loss of the retail broker as a stock seller.
If you don't have access to sufficient capital then capital formation, innovation and economic growth will suffer. That is clearly where we are.
I have also heard from someone who was once active in convincing the SEC to expand approval of ETF applications (and prefers to remain anonymous). He asserts that "the effects now being debated were certainly not an anticipated consequence. I can't remember a single conversation externally or internally at the SEC about whether the creation and redemption mechanism would increase correlations."

In hindsight it seems obvious that returns would become more highly correlated, but the fact that it was completely unanticipated at the time illustrates the enormous challenge of regulatory adaptation to financial innovation.

Wednesday, December 08, 2010

Building a Computational Model of the Crisis

A team of four researchers affiliated with the Santa Fe Institute has secured a grant from the Institute for New Economic Thinking to fund the development of an agent-based computational model of the financial crisis. The model will explicitly consider "housing and mortgage markets, banks and other financial institutions, securitization processes and hedge fund investors, manufacturing and service firms, and regulatory agencies," with the goal of discovering "the essential elements needed to reproduce the crisis, while investigating alternative policies that may have reduced its intensity and strategies for recovery."

It's an interesting and multidisciplinary group, composed of Doyne Farmer, John Geanakoplos, Peter Howitt and Robert Axtell. Geanakoplos and Howitt are two of the most creative economists around, and I have discussed the work of the former on leverage and the latter on learning in earlier posts. Axtell is the co-author (with Joshua Epstein) of a fascinating book called Growing Artificial Societies, in which they develop an elaborate computational model of the interaction between a renewable resource base and the human population that depends on it. The model reproduces spatial patterns of resource depletion and recovery as well as population growth, migration and decline. Farmer is a physicist by training but has been working on finance for as long as I can remember. I discussed some of his work in an earlier post making a case for greater methodological pluralism in economics in general, and agent-based modeling in particular.

The team is looking for a graduate student or postdoctoral fellow to join them for a couple of years. For a young researcher interested in finance, the microfoundations of macroeconomics, and the agent-based computational methodology, this could be a fantastic opportunity.