Complexity, ‘fog and moonlight’, prediction, and politics III – von Neumann and economics as a science

The two previous blogs in this series were:

Part I HERE.

Part II HERE.

All page references unless otherwise stated are to my essay, HERE.

Since the financial crisis, there has been a great deal of media and Westminster discussion about why so few people predicted it and what the problems are with economics and financial theory.

Absent from most of this discussion is the history of the subject and its intellectual origins. Economics is clearly a vital area of prediction for people in politics. I therefore will explore some intellectual history to provide context for contemporary discussions about ‘what is wrong with economics and what should be done about it’.

*

It has often been argued that the ‘complexity’ of human behaviour renders precise mathematical treatment of economics impossible, or that the undoubted errors of modern economics in applying the tools of mathematical physics are evidence of the irredeemable hopelessness of the goal.

For example, Kant wrote in Critique of Judgement:

‘For it is quite certain that in terms of merely mechanical principles of nature we cannot even adequately become familiar with, much less explain, organized beings and how they are internally possible. So certain is this that we may boldly state that it is absurd for human beings even to attempt it, or to hope that perhaps some day another Newton might arise who would explain to us, in terms of natural laws unordered by any intention, how even a mere blade of grass is produced. Rather, we must absolutely deny that human beings have such insight.’

In the middle of the 20th Century, one of its greatest minds turned to this question. John von Neumann was one of the leading mathematicians of the century. He was also a major contributor to the mathematisation of quantum mechanics, created the field of ‘quantum logic’ (1936), worked as a consultant to the Manhattan Project and other wartime technological projects, and was (with Turing) one of the two most important creators of modern computer science and artificial intelligence, which he developed partly for immediate problems he was working on (e.g. the hydrogen bomb and ICBMs) and partly to probe the general field of understanding complex nonlinear systems. In an Endnote of my essay I discuss some of these things.

Von Neumann was regarded as an extraordinary phenomenon even by the cleverest people in the world. The Nobel-winning physicist and mathematician Wigner said of von Neumann:

‘I have known a great many intelligent people in my life. I knew Planck, von Laue and Heisenberg. Paul Dirac was my brother in law; Leo Szilard and Edward Teller have been among my closest friends; and Albert Einstein was a good friend, too. But none of them had a mind as quick and acute as Jancsi von Neumann. I have often remarked this in the presence of those men and no one ever disputed me… Perhaps the consciousness of animals is more shadowy than ours and perhaps their perceptions are always dreamlike. On the opposite side, whenever I talked with the sharpest intellect whom I have known – with von Neumann – I always had the impression that only he was fully awake, that I was halfway in a dream.’

Von Neumann also had a big impact on economics. During breaks from pressing wartime business, he wrote ‘Theory of Games and Economic Behaviour’ (TGEB) with Morgenstern. This practically created the field of ‘game theory’, to which one now sees so many references. TGEB was one of the most influential books ever written on economics. (The movie A Beautiful Mind gave a false impression of Nash’s contribution.) In the Introduction, von Neumann’s explanation of some foundational issues concerning economics, mathematics, and prediction is clearer for non-specialists than anything else I have seen on the subject and cuts through a vast amount of contemporary discussion which fogs the issues.

This documentary on von Neumann is also interesting.

*

There are some snippets from pre-20th Century figures explaining concepts in terms recognisable through the prism of Game Theory. For example, Ampère wrote ‘Considérations sur la théorie mathématique du jeu’ in 1802 and credited Buffon’s 1777 essay on ‘moral arithmetic’ (Buffon figured out many elements that Darwin would later harmonise in his theory of evolution). Cournot’s 1838 analysis of duopoly described what would later be recognised as a specific example of a ‘Nash equilibrium’. The French mathematician Émile Borel also made early contributions.

However, Game Theory really was born with von Neumann. In December 1926, he presented the paper ‘Zur Theorie der Gesellschaftsspiele’ (On the Theory of Parlour Games, published in 1928, translated version here) while working on the Hilbert Programme [cf. Endnote on Computing] and quantum mechanics. The connection between the Hilbert Programme and the intellectual origins of Game Theory can perhaps first be traced to a 1912 lecture by one of the world’s leading mathematicians and founders of modern set theory, Zermelo, titled ‘On the Application of Set Theory to Chess’, which stated of its purpose:

‘… it is not dealing with the practical method for games, but rather is simply giving an answer to the following question: can the value of a particular feasible position in a game for one of the players be mathematically and objectively decided, or can it at least be defined without resorting to more subjective psychological concepts?’

He presented a theorem that chess is strictly determined: that is, either (i) white can force a win, or (ii) black can force a win, or (iii) both sides can force at least a draw. Which of these is the actual solution to chess remains unknown. (Cf. ‘Zermelo and the Early History of Game Theory’, by Schwalbe & Walker (1997), which argues that modern scholarship is full of errors about this paper. According to Leonard (2006), Zermelo’s paper was part of a general interest in the game of chess among intellectuals in the first third of the 20th century. Lasker (world chess champion 1894–1921) knew Zermelo and both were taught by Hilbert.)
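Zermelo’s theorem is, at bottom, backward induction over a finite game tree. Here is a minimal sketch in Python (my illustration, with an invented toy tree, not Zermelo’s notation): label each terminal position +1, 0 or -1 from White’s point of view, then work backwards, White taking the maximum over her moves and Black the minimum.

```python
# Backward induction on a tiny two-player, zero-sum game tree.
# Values are from White's point of view: +1 win, 0 draw, -1 loss.
# Zermelo's theorem: every finite game of this kind has a determined value.

def value(node, white_to_move=True):
    """Return the determined value of a position by backward induction."""
    if isinstance(node, int):      # leaf: the game is over
        return node
    children = [value(child, not white_to_move) for child in node]
    return max(children) if white_to_move else min(children)

# A toy game tree: nested lists are positions, ints are terminal outcomes.
tree = [
    [1, -1],       # after White's first option, Black picks the reply
    [0, [1, 0]],   # after the second option, Black can take a draw or play on
]

print(value(tree))  # 0 -> with best play, both sides can force at least a draw
```

Chess is ‘strictly determined’ in exactly this sense; the practical difficulty, as von Neumann notes below, is that the real tree is astronomically large.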

Von Neumann later wrote:

‘[I]f the theory of Chess were really fully known there would be nothing left to play.  The theory would show which of the three possibilities … actually holds, and accordingly the play would be decided before it starts…  But our proof, which guarantees the validity of one (and only one) of these three alternatives, gives no practically usable method to determine the true one. This relative, human difficulty necessitates the use of those incomplete, heuristic methods of playing, which constitute ‘good’ Chess; and without it there would be no element of ‘struggle’ and ‘surprise’ in that game.’ (p.125)

Elsewhere, he said:

‘Chess is not a game. Chess is a well-defined computation. You may not be able to work out the answers, but in theory there must be a solution, a right procedure in any position. Now, real games are not like that at all. Real life is not like that. Real life consists of bluffing, of little tactics of deception, of asking yourself what is the other man going to think I mean to do. And that is what games are about in my theory.’

Von Neumann’s 1928 paper proved that there is a rational solution to every two-person zero-sum game. That is, in a rigorously defined game with precise payoffs, there is a mathematically rational strategy for both sides – an outcome on which neither party can hope to improve. This introduced the concept of the minimax: choose the strategy that minimises the possible maximum loss.

Zero-sum games are those where the payoffs ‘sum’ to zero. For example, chess or Go are zero-sum games because the gain (+1) and the loss (-1) sum to zero; one person’s win is another’s loss. The famous Prisoners’ Dilemma is a non-zero-sum game because the payoffs do not sum to zero: it is possible for both players to make gains. In some games the payoffs to the players are symmetrical (e.g. Prisoners’ Dilemma); in others, the payoffs are asymmetrical (e.g. the Dictator or Ultimatum games). Sometimes the strategies can be completely stated without the need for probabilities (‘pure’ strategies); sometimes, probabilities have to be assigned for particular actions (‘mixed’ strategies).
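To make these definitions concrete, here is a minimal sketch (the payoff matrix is invented for illustration) of the minimax rule in a two-person zero-sum game given as a matrix of payoffs to the row player:

```python
# Minimax in pure strategies for a two-person zero-sum game.
# payoff[i][j] = payoff to the row player (the column player gets the negative).
payoff = [
    [ 3, -2, 2],
    [ 1,  0, 1],
    [-4, -3, 5],
]

# Row player: the worst case of each strategy is the row minimum, so
# minimising the maximum loss means picking the row with the largest minimum.
maximin = max(min(row) for row in payoff)

# Column player: the symmetric reasoning, on column maxima.
minimax = min(max(col) for col in zip(*payoff))

print(maximin, minimax)  # 0 0
# When the two coincide, the game has a saddle point in pure strategies:
# here the second row against the second column (the entry 0) solves the
# game, and neither side can improve on it by deviating alone.
# When maximin != minimax there is no pure saddle point, and mixed
# strategies become necessary.
```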

While the optimal minimax strategy might be a ‘pure’ strategy, von Neumann showed it would often have to be a ‘mixed strategy’ and this means a spontaneous return of probability, even if the game itself does not involve probability.

‘Although … chance was eliminated from the games of strategy under consideration (by introducing expected values and eliminating ‘draws’), it has now made a spontaneous reappearance. Even if the rules of the game do not contain any elements of ‘hazard’ … in specifying the rules of behaviour for the players it becomes imperative to reconsider the element of ‘hazard’. The dependence on chance (the ‘statistical’ element) is such an intrinsic part of the game itself (if not of the world) that there is no need to introduce it artificially by way of the rules of the game itself: even if the formal rules contain no trace of it, it still will assert itself.’
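Matching Pennies is the simplest illustration of this ‘spontaneous reappearance’ of chance. A minimal sketch (mine, not von Neumann’s): no pure strategy gives either player a useful guarantee, but randomising 50/50 secures the value of the game, zero, whatever the opponent does.

```python
# Matching Pennies: the row player wins +1 if the two coins match, -1 if not.
payoff = [[ 1, -1],
          [-1,  1]]

# In pure strategies the guarantees do not meet:
print(max(min(r) for r in payoff))        # -1: the row player's best guarantee
print(min(max(c) for c in zip(*payoff)))  # +1: the column player's best guarantee

# Mixed strategy: play Heads with probability p. The worst case over the
# opponent's two replies is the guaranteed expected payoff.
def worst_case(p):
    return min(p - (1 - p), (1 - p) - p)

# Scanning probabilities shows the guarantee is maximised at p = 1/2, value 0.
best_p = max((p / 100 for p in range(101)), key=worst_case)
print(best_p, worst_case(best_p))         # 0.5 0.0 -- chance is forced on us
```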

In 1932, he gave a lecture titled ‘On Certain Equations of Economics and A Generalization of Brouwer’s Fixed-Point Theorem’. It was published in German in 1938 but not in English until 1945 when it was published as ‘A Model of General Economic Equilibrium’. This paper developed what is sometimes called von Neumann’s Expanding Economic Model and has been described as the most influential article in mathematical economics. It introduced the use of ‘fixed-point theorems’. (Brouwer’s ‘fixed point theorem’ in topology proved that, in crude terms, if you lay a map of the US on the ground anywhere in the US, one point on the map will lie precisely over the point it represents on the ground beneath.)

‘The mathematical proof is possible only by means of a generalisation of Brouwer’s Fix-Point Theorem, i.e. by the use of very fundamental topological facts… The connection with topology may be very surprising at first, but the author thinks that it is natural in problems of this kind. The immediate reason for this is the occurrence of a certain ‘minimum-maximum’ problem… It is closely related to another problem occurring in the theory of games.’

Von Neumann’s application of this topological proof to economics was very influential in post-war mathematical economics and in particular was used by Arrow and Debreu in their seminal 1954 paper on general equilibrium, perhaps the central paper in modern traditional economics.
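In one dimension, Brouwer’s theorem reduces to the intermediate value theorem, which makes the ‘map on the ground’ picture easy to compute with. A minimal sketch (the particular function is an arbitrary example of mine): any continuous map of [0,1] into itself must leave some point where it is, and bisection finds that point.

```python
import math

# Any continuous f: [0,1] -> [0,1] has a point with f(x) = x (Brouwer in 1-D).
# g(x) = f(x) - x is >= 0 at x = 0 and <= 0 at x = 1, so bisect for a root.
def fixed_point(f, lo=0.0, hi=1.0, tol=1e-12):
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) - mid > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

f = lambda x: math.cos(x) / 2 + 0.3  # an arbitrary continuous self-map of [0,1]
x = fixed_point(f)
print(x, f(x))                       # f leaves this point exactly where it is
```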

*

In the late 1930s, von Neumann, based at the IAS in Princeton, to which Gödel and Einstein also fled to escape the Nazis, met the economist Oskar Morgenstern, who was deeply dissatisfied with the state of economics. In 1940, while working on war business including the Manhattan Project and computers, von Neumann began the collaboration with Morgenstern that became The Theory of Games and Economic Behavior (TGEB). By December 1942 he had finished his work on it, though it was not published until 1944.

In the Introduction of TGEB, von Neumann explained the real problems in applying mathematics to economics and why Kant was wrong.

‘It is not that there exists any fundamental reason why mathematics should not be used in economics.  The arguments often heard that because of the human element, of the psychological factors etc., or because there is – allegedly – no measurement of important factors, mathematics will find no application, can all be dismissed as utterly mistaken.  Almost all these objections have been made, or might have been made, many centuries ago in fields where mathematics is now the chief instrument of analysis [e.g. physics in the 16th Century or chemistry and biology in the 18th]…

‘As to the lack of measurement of the most important factors, the example of the theory of heat is most instructive; before the development of the mathematical theory the possibilities of quantitative measurements were less favorable there than they are now in economics.  The precise measurements of the quantity and quality of heat (energy and temperature) were the outcome and not the antecedents of the mathematical theory…

‘The reason why mathematics has not been more successful in economics must be found elsewhere… To begin with, the economic problems were not formulated clearly and are often stated in such vague terms as to make mathematical treatment a priori appear hopeless because it is quite uncertain what the problems really are. There is no point using exact methods where there is no clarity in the concepts and issues to which they are applied. [Emphasis added] Consequently the initial task is to clarify the knowledge of the matter by further careful descriptive work. But even in those parts of economics where the descriptive problem has been handled more satisfactorily, mathematical tools have seldom been used appropriately. They were either inadequately handled … or they led to mere translations from a literary form of expression into symbols…

‘Next, the empirical background of economic science is definitely inadequate. Our knowledge of the relevant facts of economics is incomparably smaller than that commanded in physics at the time when mathematization of that subject was achieved.  Indeed, the decisive break which came in physics in the seventeenth century … was possible only because of previous developments in astronomy. It was backed by several millennia of systematic, scientific, astronomical observation, culminating in an observer of unparalleled calibre, Tycho de Brahe. Nothing of this sort has occurred in economics. It would have been absurd in physics to expect Kepler and Newton without Tycho – and there is no reason to hope for an easier development in economics…

‘Very frequently the proofs [in economics] are lacking because a mathematical treatment has been attempted in fields which are so vast and so complicated that for a long time to come – until much more empirical knowledge is acquired – there is hardly any reason at all to expect progress more mathematico. The fact that these fields have been attacked in this way … indicates how much the attendant difficulties are being underestimated. They are enormous and we are now in no way equipped for them.

‘[We will need] changes in mathematical technique – in fact, in mathematics itself…  It must not be forgotten that these changes may be very considerable. The decisive phase of the application of mathematics to physics – Newton’s creation of a rational discipline of mechanics – brought about, and can hardly be separated from, the discovery of the infinitesimal calculus…

‘The importance of the social phenomena, the wealth and multiplicity of their manifestations, and the complexity of their structure, are at least equal to those in physics.  It is therefore to be expected – or feared – that mathematical discoveries of a stature comparable to that of calculus will be needed in order to produce decisive success in this field… A fortiori, it is unlikely that a mere repetition of the tricks which served us so well in physics will do for the social phenomena too.  The probability is very slim indeed, since … we encounter in our discussions some mathematical problems which are quite different from those which occur in physical science.’

Von Neumann therefore exhorted economists to humility and the task of ‘careful, patient description’, a ‘task of vast proportions’. He stressed that economics could not attack the ‘big’ questions – much more modesty is needed to establish an exact theory for very simple problems, and build on those foundations.

‘The everyday work of the research physicist is … concerned with special problems which are “mature”… Unifications of fields which were formerly divided and far apart may alternate with this type of work. However, such fortunate occurrences are rare and happen only after each field has been thoroughly explored. Considering the fact that economics is much more difficult, much less understood, and undoubtedly in a much earlier stage of its evolution as a science than physics, one should clearly not expect more than a development of the above type in economics either…

‘The great progress in every science came when, in the study of problems which were modest as compared with ultimate aims, methods were developed which could be extended further and further. The free fall is a very trivial physical example, but it was the study of this exceedingly simple fact and its comparison with astronomical material which brought forth mechanics. It seems to us that the same standard of modesty should be applied in economics… The sound procedure is to obtain first utmost precision and mastery in a limited field, and then to proceed to another, somewhat wider one, and so on.’

Von Neumann’s aim in TGEB was therefore ‘the behavior of the individual and the simplest forms of exchange’, with the hope that this could be extended to more complex situations.

‘Economists frequently point to much larger, more ‘burning’ questions…  The experience of … physics indicates that this impatience merely delays progress, including that of the treatment of the ‘burning’ questions. There is no reason to assume the existence of shortcuts…

‘It is a well-known phenomenon in many branches of the exact and physical sciences that very great numbers are often easier to handle than those of medium size. An almost exact theory of a gas, containing about 10^25 freely moving particles, is incomparably easier than that of the solar system, made up of 9 major bodies… This is … due to the excellent possibility of applying the laws of statistics and probabilities in the first case.

‘This analogy, however, is far from perfect for our problem. The theory of mechanics for 2,3,4,… bodies is well known, and in its general theoretical … form is the foundation of the statistical theory for great numbers. For the social exchange economy – i.e. for the equivalent ‘games of strategy’ – the theory of 2,3,4… participants was heretofore lacking. It is this need that … our subsequent investigations will endeavor to satisfy. In other words, only after the theory for moderate numbers of participants has been satisfactorily developed will it be possible to decide whether extremely great numbers of participants simplify the situation.’

[This last bit has changed slightly as I forgot to include a few things.]

While some of von Neumann’s ideas were extremely influential on economics, his general warning here about the right approach to the use of mathematics was not widely heeded.

Most economists initially ignored von Neumann’s ideas. The economist and game theorist Martin Shubik, then at Princeton, recounted the scene he found:

‘The contrast of attitudes between the economics department and mathematics department was stamped on my mind… The former projected an atmosphere of dull-business-as-usual conservatism… The latter was electric with ideas… When von Neumann gave his seminar on his growth model, with a few exceptions, the serried ranks of Princeton economists could scarce forbear to yawn.’

However, a small but influential number, including mathematicians at the RAND Corporation (the first recognisable modern ‘think tank’) led by John Williams, applied it to nuclear strategy as well as economics. For example, Albert Wohlstetter published his Selection and Use of Strategic Air Bases (RAND, R-266, sometimes referred to as The Basing Study) in 1954. Williams persuaded the RAND Board and the infamous SAC General Curtis LeMay to develop a social science division at RAND that could include economists and psychologists to explore the practical potential of Game Theory further. He also hired von Neumann as a consultant; when the latter said he was too busy, Williams told him he only wanted the time it took von Neumann to shave in the morning. (Kubrick’s Dr Strangelove satirised RAND’s use of game theory.)

In the 1990s, the movie A Beautiful Mind brought John Nash into pop culture, giving the misleading impression that he was the principal developer of Game Theory. Nash’s fame rests mainly on work he did in 1950-1 that became known as ‘the Nash Equilibrium’. In Non-Cooperative Games (1950), he wrote:

‘[TGEB] contains a theory of n-person games of a type which we would call cooperative. This theory is based on an analysis of the interrelationships of the various coalitions which can be formed by the players of the game. Our theory, in contradistinction, is based on the absence of coalitions in that it is assumed each participant acts independently, without collaboration or communication with any of the others… [I have proved] that a finite non-cooperative game always has at least one equilibrium point.’

Von Neumann remarked of Nash’s results, ‘That’s trivial, you know. It’s just a fixed point theorem.’ Nash himself said that von Neumann was a ‘European gentleman’ but one who was not impressed by his results.
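Trivial or not, Nash’s notion is easy to state computationally: an equilibrium point is a pair of strategies from which neither player can gain by deviating alone. A minimal sketch (the coordination game at the end is an invented example) that finds the pure-strategy equilibria of a two-player game:

```python
from itertools import product

# Pure-strategy Nash equilibria of a two-player game.
# A[i][j] = row player's payoff, B[i][j] = column player's payoff.
def nash_equilibria(A, B):
    eqs = []
    for i, j in product(range(len(A)), range(len(A[0]))):
        row_ok = all(A[i][j] >= A[k][j] for k in range(len(A)))     # row cannot gain
        col_ok = all(B[i][j] >= B[i][k] for k in range(len(A[0])))  # column cannot gain
        if row_ok and col_ok:
            eqs.append((i, j))
    return eqs

# A simple coordination game: both players want to pick the same option.
A = [[2, 0],
     [0, 1]]
B = [[2, 0],
     [0, 1]]
print(nash_equilibria(A, B))  # [(0, 0), (1, 1)]: two pure equilibria
```

Nash’s theorem guarantees at least one equilibrium point once mixed strategies are allowed; a brute-force check like this only finds the pure ones.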

In 1949-50, Merrill Flood, another RAND researcher, began experimenting with staff at RAND (and his own children) playing various games. Nash’s results prompted Flood to create what became known as the ‘Prisoners’ Dilemma’ game, the most famous and most studied game in Game Theory. It was initially known as ‘a non-cooperative pair’; the name ‘Prisoners’ Dilemma’ was given to it later in 1950 by Tucker who, needing a way to explain the concept to a psychology class at Stanford, hit on an anecdote putting the payoff matrix in the form of two prisoners in separate cells weighing the pros and cons of ratting on each other.

The game was discussed and played at RAND without being published. Flood wrote up the results in 1952 as an internal RAND memo accompanied by the real-time comments of the players. In 1958, Flood published the results formally (Some Experimental Games). Flood concluded that ‘there was no tendency to seek as the final solution … the Nash equilibrium point.’ The Prisoners’ Dilemma has been called ‘the E. coli of social psychology’ by Axelrod, so popular has it become in so many different fields. Many studies of Iterated Prisoners’ Dilemma games have shown that, generally, neither human nor evolved genetic-algorithm players converge on the Nash equilibrium: they choose to cooperate far more than Nash’s theory predicts.
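That divergence is easy to reproduce. A minimal sketch of an Iterated Prisoners’ Dilemma (the payoffs are the conventional ones; the two strategies are merely illustrative): a pair of tit-for-tat players sustains cooperation and far out-scores a pair of ‘rational’ defectors playing the one-shot Nash equilibrium.

```python
# Iterated Prisoners' Dilemma with the conventional payoffs:
# both cooperate -> 3 each; both defect -> 1 each;
# a lone defector -> 5, the betrayed cooperator -> 0.
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(my_hist, their_hist):
    return their_hist[-1] if their_hist else 'C'  # copy the opponent's last move

def always_defect(my_hist, their_hist):
    return 'D'                                    # the one-shot Nash play

def play(s1, s2, rounds=200):
    h1, h2, score1, score2 = [], [], 0, 0
    for _ in range(rounds):
        m1, m2 = s1(h1, h2), s2(h2, h1)
        p1, p2 = PAYOFF[(m1, m2)]
        h1.append(m1); h2.append(m2)
        score1 += p1; score2 += p2
    return score1, score2

print(play(tit_for_tat, tit_for_tat))      # (600, 600): cooperation sustained
print(play(always_defect, always_defect))  # (200, 200): the Nash outcome
print(play(tit_for_tat, always_defect))    # (199, 204): defection gains little
```

This is the pattern Axelrod’s famous tournaments found: reciprocating strategies like tit-for-tat thrive where unconditional defectors stagnate.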

Section 7 of my essay discusses some recent breakthroughs, particularly the paper by Press & Dyson. This is also a good example of how mathematicians can invade fields. Dyson’s professional fields are maths and physics. He was persuaded to look at the Prisoners’ Dilemma and very quickly saw a previously unnoticed class of strategies, a discovery that has opened up a whole new field for exploration. This article HERE is a good summary of recent developments.

Von Neumann’s forays into economics were a minor sideline for him, but there is no doubt of his influence. Despite von Neumann’s reservations about neoclassical economics, Paul Samuelson admitted that ‘He darted briefly into our domain, and it has never been the same since.’

In 1987, the Santa Fe Institute, founded by Gell-Mann and others, organised a ten-day meeting to discuss economics. On one side, they invited leading economists such as Kenneth Arrow and Larry Summers; on the other, they invited physicists, biologists, and computer scientists, such as the Nobel-winning physicist Philip Anderson and John Holland (inventor of genetic algorithms). When the economists explained their assumptions, Anderson said to them, ‘You guys really believe that?’

One physicist later described the meeting as like visiting Cuba: the cars are all from the 1950s, so on one hand you admire the ingenuity that keeps them going, but on the other hand they are old technology; similarly, the economists were ingeniously using 19th Century maths and physics on badly out-of-date models. The physicists were shocked at how content the economists were with simplifying assumptions obviously contradicted by reality, and they were surprised at how unconcerned the economists seemed about how poor their predictions were.

Twenty-seven years later, this problem is more acute. Some economists are listening to the physicists about fundamental problems with the field. Some are angrily rejecting the physicists’ incursions into their field.

Von Neumann explained the scientifically accurate approach to economics and mathematics. [Inserted later. I mean the first part of his comments above, which discusses maths, prediction, models, and economics and physics. As far as I know, nobody seriously disputes these comments – i.e. that Kant and the general argument that ‘maths cannot make inroads into economics’ are wrong. The later comments about building up economic theories from theories of 2, 3, 4 agents etc are a separate topic. See comments.] In other blogs in this series I will explore some of the history of economic thinking as part of a description of the problem for politicians and other decision-makers who need to make predictions.

Please leave corrections and comments below.

 

8 thoughts on “Complexity, ‘fog and moonlight’, prediction, and politics III – von Neumann and economics as a science”

  1. Hi Dom

    Only having read through twice, I don’t expect that this comment reflects much understanding. But this is where my problem lies:

    “Von Neumann therefore exhorted economists to humility and the task of ‘careful, patient description’, a ‘task of vast proportions’. He stressed that economics could not attack the ‘big’ questions – much more modesty is needed to establish an exact theory for very simple problems, and build on those foundations.”

    It is this last sentence that I would dispute, which is also echoed in the last quote you provide from VN:

    “In other words, only after the theory for moderate numbers of participants has been satisfactorily developed will it be possible to decide whether extremely great numbers of participants simplify the situation.’”

    Here is what I think. The crisis to which you allude in your very first paragraphs is, in my view, a macro economic crisis. It is not like, say, a computer malfunction, or the destruction of the Death Star where one small fault can chaotically snowball into an apocalypse. Instead, it is the result of certain massive, and relatively knowable, aggregates failing to rise at the right pace. In fact, for all the millions of words written about the Financial Crisis, all the opportunistic attempts by both Left and Right to use it for their rhetorical ends, it might be described in fairly mundane terms. (for a straightforward statement of this view, see http://thefaintofheart.wordpress.com/2014/09/10/many-still-believe-the-great-recession-was-the-result-of-a-modeling-error/)

    A crude analogy. Consider a boating lake, full of bobbing boats of all sizes. Suddenly they start crashing around into each other, some sinking, some scraping the bottom, some capsizing. No expert with any conceivable computer could predict their chaotic movements, even with a perfect “theory of 2,3,4 participants” – as you clearly illustrate with your earlier discussion of chess games. But it turns out that this is happening because, suddenly, a third of the water was drained from the lake. While foreknowledge of this draining might not give anyone an ability to determine what would happen exactly to each boat, the knowledge that something catastrophic would happen, should the water be drained in that way, requires little theory. Above all, it doesn’t require any theory needing to be built up from the patient study of objects of 2, 3, 4 or so in complexity.

    In other words, while macro doesn’t require the accurate summation of lots of micro.

    Macro is more like the theory of gas that you allude to. It was macro that failed. It was an avoidable failure, in my view but not in the view of the mainstream. But you and VN dispute this; you seem to feel that this is yet to be established. I think it is probably the case that aggregate relationships in economics *are* quite well established. For example, the relationship between hours worked, the total envelope of labour income, the behaviour of wages, is all fairly simple.

    But many of the mechanisms that lead to causal relations between those aggregates are not so well known or are disputed – or are crucially dependent on the views of participants. For example a massive temporary money injection can achieve little; a smaller injection believed to be permanent may have enormous effects. Multiple equilibria abound.

    None of this is to dispute the quite valid incredulity displayed by the physicists in 1987 that you end with. I suspect that there is a fair amount of economics out there trying to be like physics and making an a** of itself. My folksy examples and “market monetarist” leanings suggest that I’d agree with you that “model it or it isn’t real” is not the way forward for economists trying to explain the crisis. But this also means that VN’s demand that we build up from theories of 2,3, and 4 to theories of the aggregate is just the wrong way to go.


    • In this sentence ‘In other words, while macro doesn’t require the accurate summation of lots of micro’ – is the ‘while’ redundant or did something get chopped off?

      Not understanding much about economics I doubt I grasp your point properly but…

      1. I don’t think you are arguing with the first half of VN’s comments. I.e. Kant was wrong, some standard arguments against the use of maths in economics and other social sciences misunderstand the role of maths in physics and other physical sciences. As far as I know, nobody thinks VN is wrong about this and I’m assuming you aren’t arguing with these thoughts.

      2. Your argument is re the second half – i.e. economics needs to be built up from accurate models of very simple exchanges etc.

      3. But re your comments about macro… Isn’t the fundamental problem with macro that it is based on a whole set of assumptions around ‘equilibrium’ that we KNOW are false? Many of these VN and others pointed out back in the 1940s/50s when Samuelson, Arrow et al were doing their thing. The economy – and financial markets! – do not have a tendency to equilibrium. Indeed their dynamics lead to spontaneous dis-equilibrium – i.e. bubbles and crashes. So I see your point about the lake, but macro seems to be a theory that assumes weird things like, if you suck a third of the water out the level will fall perfectly uniformly across the whole lake!? If you have a macro theory like that, then you will assume all the boats will just uniformly sink together, etc.

      4. You write, ‘aggregate relationships in economics *are* quite well established. For example [XYZ] is all fairly simple.’ But if they were ‘well established’ and ‘all fairly simple’, and like a statistical theory of gas (which does work), should we not be able to make good predictions about them?

      5. If I understand right, some of the stuff physicists are doing now with ‘agent-based models’ and what not seems more attuned to what von Neumann was suggesting in the 1940s, particularly given his work on simulations in the first computers he was building in Princeton etc.

      D


      • Thanks Dom

        3. I am not the authority to judge this, but I don’t think these sorts of assumptions about uniformity are intrinsic to macro at all. Certainly, the possibility of discontinuities etc has been acknowledged for a while, and the “uniform boat drop” is probably not much of an issue; economics modellers will know that some sectors are more geared to the cycle, for example, and be able to adjust accordingly. The particular case of finance isn’t settled, I am sure, and there are those who see it driving everything (eg Martin Wolf?) and those who see it as more of a consequence of the macro than a driver, or a mirror through which it is all seen. But I do know there has been significant work on this area (e.g. Bernanke and Gertler, I think?).

        On the equilibrium point, again I have to disavow any expertise, but I think the fluid way macro is done merely suggests that the system is always moving towards the equilibrium, not necessarily always at it. So, say, a massive rise in government spending ought to raise equilibrium real interest rates – this sets in motion various economic impulses – but the exact point of that equilibrium isn’t necessarily reached (because of randomness, other factors moving, etc). Enough to know that pulling that lever produces that tendency.

        4. I think the general principles of weather may be known but longer-term weather forecasts are still difficult, because there are so many intermeshing factors. Predictability of outcomes is not a good test – a good sense of what the relationship between various variables may end up being is maybe the best we can expect.

        Gotta dash, thanks for this

        Giles


  2. Too many important ideas here to comment adequately on them all, but some thoughts.

    There is not one set of mathematical approaches to economics but many.

    I’ll leave it to others to comment on the statistical and mathematical approaches to macroeconomics, which is the area most commonly criticised for poor prediction, the area in which the public and the policy world are most interested in what economics can teach, and the area in which there is least consensus among economists about the right approaches.

    In microeconomics, Dom points out that von Neumann provided an important building block in the development of the Arrow-Debreu-McKenzie model of general equilibrium. This model shows us that the circumstances in which Adam Smith’s ‘invisible hand’ works perfectly are very special. Nevertheless, the mass of conventional microeconomic theory of which Arrow-Debreu-McKenzie is the idealised core is moderately successful in understanding how many real-world markets work. If you want to think about how the growth of the Chinese economy might affect income distribution in the UK, or how the London congestion charge might affect travel patterns, there’s a toolbox that works moderately well, and is (at least among professional economists) reasonably uncontroversial in broad terms.

    An approach whose fundamentals are based on an idealised view of competition, with perfect information and foresight, seems to work moderately well in understanding a world in which competition is not perfect, and there is lots of imperfect information and uncertainty. Specifically, it works reasonably well in describing markets and situations in which we can largely ignore von Neumann’s concern about ‘strategic’ thinking: we don’t “worry about what the other man is going to do”. We don’t, in other words, need game theory, in which the central theme is precisely about worrying what the other “man” is going to do. Agents in the standard microeconomic model are small relative to the market.

    This changes completely when we deal with issues where agents have to worry about how their actions interact with the actions of other agents – the central concern of game theory. Solving even the simplest games, for example getting to the Nash equilibrium in a two-agent Cournot model, requires assumptions of sophisticated rational thought by the agents. Only slightly more complex games require highly implausible levels of strategic thinking, and there’s now a large body of experimental evidence that human subjects don’t behave ‘rationally’ in the game-theoretic sense.

    von Neumann was surely right that the right starting point was the application of game theoretic thinking to small models. And there are impressive achievements. The prisoners’ dilemma model is a powerful aid to understanding a huge range of important social and economic problems – over-exploitation of common resources, congestion, and pollution; and the repeated prisoners’ dilemma takes us into other interesting areas of understanding. Nash equilibrium enabled a proper understanding of the Cournot model and from that has flowed a rich and diverse set of models of oligopolistic markets, which, for example, provide a robust and practical understanding of cartel behaviour. One important set of developments not mentioned by Dom is simple dynamic games, notably the work of Thomas Schelling. Here we get insights into truly strategic behaviour: first-mover advantage, “tying oneself to the mast”, “burning the bridges”, with serious applications which include the strategy of nuclear deterrence.

    Small-model game theory seems therefore to have been very successful. But I share the scepticism of freethinkingeconomist about von Neumann’s programme of building a more scientific economics up from the base of small-model game theory.

    A couple of concluding questions about the Santa Fe physicists. What was it about the economists’ assumptions that they found so incredible? Presumably rational behaviour? But, as I have suggested above, the rational behaviour assumption works not badly in the standard microeconomic models, perhaps analogously to the frictionless assumptions used in much standard mechanics. Rationality in game theory models is a different issue – it does indeed seem to be deeply problematic beyond the very simplest models.

    And what’s wrong with the economists’ mathematical tools? Yes, they are rooted in 19th century analysis (the tools of Newtonian physics), overlaid with some of the tools that emerged from 20th century linear and non-linear programming (Farkas’ lemma and separating hyperplanes), plus of course game theory in its various branches. But the Cuban cars analogy suggests that the physicists thought that there are bodies of more recent mathematics that would be a better toolbox than economists currently have. I’m sceptical about this – having seen too many ill-considered attempts to apply techniques from other disciplines to economics, without enough understanding of what the economics questions were to which answers were sought. (If I were asked which box of tools I’d be interested in importing into economics, it would be the tools of computer gaming, not those of 21st century physics.)


    • Probably what they mean is that the mathematical techniques used in physics have moved on a lot over the last 100 years but that does not seem to have happened to a similar extent in economics, not necessarily that the economists should be using the exact same tools as the physicists.


  3. Just a quick note to say thanks for your blog posts, Dominic. I am a non-economist who has worked in Westminster, at a think tank, and recognise many of the frustrations you identify with poor policymaking, both in the use of evidence and in over-confidence in models (my specialism is energy and environment policy).

    I am making my way through Beinhocker’s Origin of Wealth at the moment and it is expressing clearly many of the ideas of complexity and non-linearity. They are extremely powerful approaches. As one who spends time thinking about how you can improve environmental systems, there are a lot of resonances.

    Keep up the good work.

