Prof. Jayanth R. Varma's Financial Markets Blog

Computational and sociological analyses of financial modeling

I have been reading a number of papers that examine financial modeling in the context of the current crisis from a computational complexity and sociology of knowledge point of view: Beunza and Stark, Brigo et al, MacKenzie, and Arora et al, all discussed below.

I liked all these papers and learned a lot from each of them, which is not the same as saying that I agree with all of them.

The paper that I liked most was Beunza and Stark, which is really about cognitive interdependence and systemic risk. Their work is based on an ethnographic study of financial modeling carried out over a three-year period at a top-ten global investment bank. Some of their conclusions are:

Using models in reverse, traders find out what their rivals are collectively thinking. As they react to this knowledge, their actions introduce a degree of interdependence ...

Quantitative tools and models thus give back with one hand the interdependence that they took away with the other. They hide individual identities, but let traders know what the consensus is. Arbitrageurs are thus not embedded in personal ties, but neither are they disentangled from each other.

Scopic markets are fundamentally different from traditional social settings in that the tool, not the network, is the central coordinating device.

Instead of ascribing crises to excessive risk-taking, misuse of the models, or irreflexive imitation, our notion of reflexive modeling offers an account of crises in which problems unfold in spite of repeated reassurances, early warnings, and an appreciation for independent thinking.

Implicit in the behavioral accounts of systemic risk is an emphasis on the individual biases and limitations of the investors. At the extreme, investors are portrayed as reckless gamblers, mindless lemmings, or foolish users of models they do not understand. By contrast, our detailed examination of the tools of arbitrage offers a theory of crisis that does not call for any such bias. The reflexive risks that we identified befall on arbitrageurs that are smart, creative, and reflexive about their own limitations.

Though the paper is written in sociological language, what it most reminded me of was Aumann’s paper from more than 30 years ago on “Agreeing to disagree” (The Annals of Statistics, 1976). What Beunza and Stark describe as reflexivity is closely related to Aumann’s celebrated theorem: “If two people have the same priors, and their posteriors for a given event A are common knowledge, then these posteriors must be equal.”
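
For concreteness, Aumann’s theorem can be stated in the standard textbook notation (a common prior on a finite state space and an information partition for each agent; the notation below is the usual formulation and does not come from Beunza and Stark):

    % Aumann (1976), "Agreeing to Disagree".
    % Common prior P on a finite state space Omega; agent i's
    % information is a partition Pi_i of Omega, and Pi_i(omega)
    % is the cell containing the true state omega.
    \[
        q_i = P\bigl(A \mid \Pi_i(\omega)\bigr), \qquad i = 1, 2.
    \]
    % If q_1 and q_2 are common knowledge at omega (constant on the
    % member of the meet Pi_1 \wedge Pi_2 that contains omega), then
    \[
        q_1 = q_2.
    \]

The connection to reflexive modeling is that traders who back out the consensus from model-implied prices are, in effect, making each other’s posteriors common knowledge.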

The Brigo et al paper is mathematically demanding, as the authors take “an extensive technical path, starting with static copulas and ending up with dynamic loss models.” But it is very useful in explaining why the Gaussian copula model is still used in its base correlation formulation even though its limitations have been known for several years. My complaint about the paper is that it focuses too much on the difficulties of fitting the Gaussian copula to observed market prices and too little on the difficulties of using it to estimate the impact of plausible stress events.
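
To make the base correlation discussion concrete, here is a minimal sketch of the one-factor Gaussian copula that underlies it. Every parameter here (100 names, a 5% marginal default probability, zero recovery, a flat correlation of 0.30, a 0-3% equity tranche) is an illustrative assumption of mine; this is the static model that Brigo et al start from, not their dynamic loss models:

    import numpy as np
    from scipy.stats import norm

    # Illustrative parameters -- not calibrated to any market data.
    n_names = 100          # credits in the portfolio
    p_default = 0.05       # marginal default probability over the horizon
    rho = 0.30             # flat pairwise correlation (the single copula input)
    n_sims = 100_000

    rng = np.random.default_rng(seed=1)
    threshold = norm.ppf(p_default)   # name i defaults iff latent X_i < threshold

    # One-factor Gaussian copula: X_i = sqrt(rho)*M + sqrt(1-rho)*Z_i
    M = rng.standard_normal((n_sims, 1))         # common market factor
    Z = rng.standard_normal((n_sims, n_names))   # idiosyncratic factors
    X = np.sqrt(rho) * M + np.sqrt(1.0 - rho) * Z

    # Fractional portfolio loss per scenario (zero recovery assumed)
    loss = (X < threshold).mean(axis=1)

    # Expected loss on a hypothetical 0-3% equity tranche
    attach, detach = 0.00, 0.03
    tranche_loss = np.clip(loss - attach, 0.0, detach - attach) / (detach - attach)
    print(f"portfolio expected loss:      {loss.mean():.4f}")
    print(f"equity tranche expected loss: {tranche_loss.mean():.4f}")

In the base correlation formulation, a different flat rho is backed out for each attachment point precisely because no single rho reproduces all tranche quotes at once, which is one of the fitting difficulties the paper describes.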

MacKenzie focuses on “evaluation cultures”, which are broader than just models. They are “pockets of local consensus on how financial instruments should be valued.” He argues that “‘Greed’ – the egocentrically-rational pursuit of profits and bonuses – matters, but the calculations that the greedy have to make are made within evaluation cultures”. MacKenzie highlights “the peculiar status of the ABS CDO as what one might call an epistemic orphan – cognitively peripheral to both its parent cultures, corporate CDOs and ABSs.”

The Arora et al paper is probably the most mathematical of the lot. It essentially shows that an originator can put bad loans into CDOs in such a way that it is computationally infeasible for the investors to figure this out even ex post.

However, for a real-life buyer who is computationally bounded, this enumeration is infeasible. In fact, the problem of detecting such a tampering is equivalent to the so-called hidden dense subgraph problem, which computer scientists believe to be intractable ... Moreover, under seemingly reasonable assumptions, there is a way for the seller to ‘plant’ a set S of such over-represented assets in a way that the resulting pooling will be computationally indistinguishable from a random pooling.

Furthermore, we can show that for suitable parameter choices the tampering is undetectable by the buyer even ex post. The buyer realizes at the end that the financial products had a higher default rate than expected, but would be unable to prove that this was due to the seller’s tampering.
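
A toy simulation conveys the flavour of this planting argument. The numbers and the naive per-pool lemon count below are my own illustrative choices, not the construction in Arora et al (whose indistinguishability result depends on careful asymptotic parameter choices and the conjectured hardness of the hidden dense subgraph problem):

    import numpy as np

    rng = np.random.default_rng(seed=7)
    n_assets, n_lemons = 1_000, 50    # seller privately knows assets 0..49 are bad
    n_pools, pool_size = 100, 30
    n_rigged, n_planted = 10, 6       # pools secretly loaded, lemons planted per pool

    def make_pools(rigged: bool) -> np.ndarray:
        """Assign assets to pools uniformly at random; if rigged,
        over-represent lemons in a small hidden subset of pools."""
        pools = np.array([rng.choice(n_assets, pool_size, replace=False)
                          for _ in range(n_pools)])
        if rigged:
            for j in range(n_rigged):
                slots = rng.choice(pool_size, n_planted, replace=False)
                pools[j, slots] = rng.choice(n_lemons, n_planted, replace=False)
                # (toy shortcut: a planted lemon may duplicate one already in
                # the pool; the real construction avoids this)
        return pools

    def lemon_counts(pools: np.ndarray) -> np.ndarray:
        # lemons are the assets numbered 0..n_lemons-1
        return (pools < n_lemons).sum(axis=1)

    honest = lemon_counts(make_pools(rigged=False))
    rigged = lemon_counts(make_pools(rigged=True))
    # The two per-pool count distributions overlap heavily; spotting WHICH
    # pools were loaded means searching an exponentially large family of
    # pool subsets -- essentially the hidden dense subgraph problem.
    print(f"honest pools: mean {honest.mean():.2f} lemons, max {honest.max()}")
    print(f"rigged pools: mean {rigged.mean():.2f} lemons, max {rigged.max()}")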

The derivatives that Arora et al discuss are weird binary CDOs, and my interpretation of this result is that in a rational market, these kinds of exotic derivatives would never be created or traded. Nevertheless, this is an important way of looking at how computational complexity can reinforce information asymmetry under certain conditions.

Posted at 6:10 pm IST on Thu, 21 Jan 2010

