Friday, January 21, 2011

Finance: The Three Mile Island Analogy

The most concise thing I can say about Richard Bookstaber's A Demon of Our Own Design is that I'm sorry I didn't read it when it first came out back in 2007.  It's an imperfect book--a not-always-successful mix of investment anecdote (good, but not distinctive) together with some of the best analytical exposition of investment risk management that I've seen anywhere.

I particularly liked his chapter on "Complexity, Tight Coupling, and Normal Accidents," where he builds a model of financial meltdown by analogy to physical disasters such as Three Mile Island and the explosion of the Challenger space shuttle.  Both categories, he argues, embody instances of small errors magnified: "the components of a process are critically interdependent; they are linked with little room for error or time for recalibration or adjustment."  Note: not merely "complex."  The post office is complex.  A university is complex.  But a lost letter does not paralyze the mail service, and a screwup in the French department is not likely to resonate over in genetics.  On the other hand, take something like Three Mile Island (or the Challenger) and you've got a system that can magnify the simplest ("normal") errors up to the edge (at least) of catastrophe.

It's a compelling argument, and I can't do it justice here; it was made no less memorable by the fact that I read it at 3 am when I couldn't sleep.  Having had my fill of nightmarish calamity, I idled over to the computer to distract myself with my news aggregator.  And what is the first thing I see?  Why, lookie here: Tim Harford at Slate.  And here's his headline for the day:

Three Mile Island, the Challenger Shuttle, and…Lehman Bros.?

Well, how about that.  Of course the "Lehman" stuff was new--when Bookstaber published Demon, Lehman hadn't happened.  But aside from that--yes, I had the privilege of reading essentially the same argument a second time.  There is no mention of Bookstaber, so far as I can tell.  Harford does say, "The connection between banks and nuclear reactors is not obvious to most bankers."  The "most" is a shrewd qualifier: at least one banker seems to know about it, enough to have written about it four years ago.

And it turns out these two aren't alone.  Later in the day, my attention was directed to James Surowiecki's "Financial Page" in The New Yorker for February 11, 2008:
In that sense, the potential collapse of monoline insurers looks like a classic example of what the sociologist Charles Perrow called a “normal accident.” In examining disasters like the Challenger explosion and the near-meltdown at Three Mile Island, Perrow argued that while the events were unforeseeable they were also, in some sense, inevitable, because of the complexity and the interconnectedness of the systems involved. When you have systems with lots of moving parts, he said, some of them are bound to fail. And if they are tightly linked to one another—as in our current financial system—then the failure of just a few parts cascades through the system. In essence, the more complicated and intertwined the system is, the smaller the margin of safety.
Today, as financial markets become ever more complex, these kinds of unanticipated ripple effects are more common ... . In the past thirty years, thanks to the combination of globalization, deregulation, and the increase in computing power, we have seen an explosion in financial innovation. This innovation has had all kinds of benefits—making cheap capital available to companies and individuals who previously couldn’t get it, allowing risk to be more efficiently allocated, and widening the range of potential investments. On a day-to-day level, it may even have lowered volatility in the markets and helped make the real economy more stable. The problem is that these improvements have been accompanied by more frequent systemic breakdowns. It may be that investors accept periodic disasters as a price worth paying for the innovations of modern finance, but now is probably not the best time to ask them about it.
Actually, for my money, Surowiecki's presentation is pithier than either of his companions' (though not nearly so richly nuanced as Bookstaber's).  As to provenance--neither Surowiecki nor Harford cites Bookstaber.  All three cite Charles Perrow.  Bookstaber says that Perrow coined the term "normal accident" in his book Normal Accidents: Living with High-Risk Technologies.  Bookstaber also cites a study by Scott Sagan entitled The Limits of Safety; he says that "these two books provide the details and examples that are the basis of the discussion in this chapter."  I'm not clear whether either of these books discusses financial markets (I've seen only Google Books snippets, which are inconclusive).  Harford does attribute some remarks on finance to Perrow, though they sound like they may have come from an interview, not from a book.

Is anybody copying anybody here?  Aside from the citations to Perrow, common to all three finance writers, it's hard to say.  Maybe Surowiecki heard it at a dinner party.  Maybe Harford ran into Perrow at an academic conference.  Maybe there's an ur-source that underlies them all.  One way or another, "normal accidents" and financial meltdown are, by this time, yoked for eternity.

Update:  My friend John reminds me that Steve Schwarcz from Duke has been writing about "tight coupling," with credit to Bookstaber. See, e.g., link.

1 comment:

Tim Harford said...

Tim Harford here, the author of the piece you're talking about. Charles Perrow wrote "Normal Accidents" in 1984, fully developing his theory with some of the examples I used in my piece. The first person I am aware of to apply the analysis of industrial accidents to finance was James Reason, in his 1997 book "Managing the Risks of Organizational Accidents". I interviewed both men for my article and read their books in great detail - hence my acknowledging their work. I didn't interview Rick Bookstaber and the ideas in my article are from Perrow and Reason, not Bookstaber.