Why Most Things Fail: Evolution, Extinction and Economics by Paul Ormerod
My rating: 5 of 5 stars
I think this is an important book, and I also think some people will find its message hard to reconcile with their world view. Ormerod sets out his stall to show that economists have presumed that the economy (and lots of other systems we deal with every day) sits in a steady state: that markets exist as a relationship between buyers and sellers, with supply and demand following perfect curves that come from perfect knowledge.
Of course, as he points out, this is nonsense. Markets don’t exist in a steady state; there are too many factors that cause change. Ormerod cites work showing that even if a steady state did occur, a shock to it could take as long as a hundred years to settle down again.
The classical view, which comes all the way from Adam Smith, assumes perfect knowledge: actors in the system need to know what every other actor is doing. This doesn’t happen. The textbook view built on assumptions about marginal costs is bogus; much of the time businesses don’t know what their own marginal costs are, or what their competitors are doing, and since no one can see into the future, even if they did it probably wouldn’t help, because they still don’t know what customers may want that they don’t yet offer. Most businesses of any size or complexity work using rules of thumb, and the MBA spreadsheets don’t help because they assume perfect information. Also, your customers might just not like what you have to offer this season.
He then looks at work done on evolution. In particular, work by Raup shows that there is a power-law relationship between the frequency of extinction events and their size. Others have found that you can draw almost exactly the same curve from business failures. This has some interesting consequences – if you play with these models and look at an arbitrary measure of ‘fitness’ in the Darwinian sense, a degree of cooperation turns out to be good for the long-term viability of the system as a whole. Ultra-competition forces prices down and isn’t good in the long run – and neither is relentless cost cutting.
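To make the power-law idea concrete, here is a rough sketch of my own – not Raup’s data or Ormerod’s model, just an assumed Pareto distribution of event sizes – showing why the frequency of events falls off as a straight line on log-log axes, which is the signature the book describes for both extinctions and firm failures:

```python
import numpy as np

# Toy illustration only: draw event "sizes" from a Pareto (power-law)
# distribution and check that log(frequency) vs log(size) is roughly linear.
rng = np.random.default_rng(42)
alpha = 1.8                              # assumed exponent, chosen for illustration
sizes = rng.pareto(alpha, 100_000) + 1   # classical Pareto samples, minimum size 1

# Count events in logarithmically spaced size bins.
bins = np.logspace(0, np.log10(sizes.max()), 30)
counts, edges = np.histogram(sizes, bins=bins)
centres = np.sqrt(edges[:-1] * edges[1:])

# With log-spaced bins the counts scale like size**(-alpha),
# so the fitted slope should come out near -alpha.
mask = counts > 0
slope, _ = np.polyfit(np.log10(centres[mask]), np.log10(counts[mask]), 1)
print(f"fitted log-log slope: {slope:.2f} (expect roughly {-alpha:.2f})")
```

The point of the exercise is only the shape: lots of small events, a few enormous ones, and no ‘typical’ size in between.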
The problem of perfect information is also addressed by looking at simple games, such as the Prisoner’s Dilemma, played over many iterations to see which strategies win over the long term, as well as a location game about where on a line you should place your ice cream stand to attract the most customers. As soon as you have more than two players, and more than one point of entry into an existing market, it becomes almost impossible to do more than map out the space of possible solutions and understand its shape. If you are one of the players, it’s hard to work out what to do.
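For anyone who hasn’t met the iterated Prisoner’s Dilemma, a minimal sketch (mine, not the book’s, using the standard textbook payoffs) shows why the long run looks so different from a single round – defection wins any one pairing, but mutual cooperation scores far better over time:

```python
# Iterated Prisoner's Dilemma: 'C' cooperate, 'D' defect.
PAYOFFS = {  # (my move, their move) -> my payoff
    ('C', 'C'): 3, ('C', 'D'): 0,
    ('D', 'C'): 5, ('D', 'D'): 1,
}

def tit_for_tat(history):
    """Cooperate first, then copy the opponent's previous move."""
    return 'C' if not history else history[-1][1]

def always_defect(history):
    return 'D'

def play(strategy_a, strategy_b, rounds=200):
    history_a, history_b = [], []       # each entry: (my move, their move)
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = strategy_a(history_a), strategy_b(history_b)
        score_a += PAYOFFS[(move_a, move_b)]
        score_b += PAYOFFS[(move_b, move_a)]
        history_a.append((move_a, move_b))
        history_b.append((move_b, move_a))
    return score_a, score_b

print(play(tit_for_tat, always_defect))  # the defector edges this pairing...
print(play(tit_for_tat, tit_for_tat))    # ...but mutual cooperation scores far higher
```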
Interestingly, we have two extinction models – one is external shocks (the asteroid of dinosaur extinction fame), the other is that a niche closes because of some change elsewhere in the competition and a species dies out. Species are both competing and cooperating (predators stop prey eating all of the available food and destroying the environment, which means both species survive), so a cascade can follow from what looks like a relatively small cause. In fact both happen; there is no either/or. But modelling this, predicting the future, becomes impossible. All you can do is look at the shape and work out how to cope with what may happen.
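A toy cascade illustrates the flavour of this – again my own sketch under crude assumptions (each ‘species’ depends on exactly one randomly chosen other), not the models Ormerod describes:

```python
import random

# One external shock removes a single species; anything whose dependency is
# gone dies too, and the failure propagates. The same small cause produces
# anything from a tiny loss to a large extinction, depending on the wiring.
random.seed(1)

def cascade_size(n_species=200):
    depends_on = [random.choice([x for x in range(n_species) if x != s])
                  for s in range(n_species)]
    alive = set(range(n_species))
    alive.discard(random.randrange(n_species))    # the external shock
    changed = True
    while changed:                                # propagate secondary extinctions
        changed = False
        for s in list(alive):
            if depends_on[s] not in alive:
                alive.discard(s)
                changed = True
    return n_species - len(alive)

sizes = sorted(cascade_size() for _ in range(500))
print("median cascade size:", sizes[250], " largest cascade size:", sizes[-1])
```

Even in something this crude, you can describe the distribution of outcomes but not predict any individual one – which is exactly Ormerod’s point.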
Ormerod concludes by giving an overview of the work of Schumpeter and Hayek, which is often ignored.
The visions of the world articulated by orthodox economists and by Hayek are fundamentally different. Conventional theory describes a highly structured mechanical system. Both the economy and society are in essence gigantic machines, whose behaviour can be controlled and predicted. Hayek’s view is much more rooted in biology. Individual behaviour is not fixed, like a screw or cog in a machine is, but evolves in response to the behaviour of others. Control and prediction of the system as a whole is simply not possible.
Ormerod gives several examples of systems coming up with robust solutions to problems (even the origins of the mighty US dollar) that weren’t obvious until the actors were left alone to find them. A good example is the hub-and-spoke architecture of US domestic airlines, which appeared after deregulation. It serves customer needs, but no one could have foreseen it at the time.
The central argument is that central planning doesn’t work, and that workable, humane solutions come from creating environments where the actors can work together on solutions that benefit them. Essentially:
… it is innovation, evolution and competition which are the hallmarks of a successful system …
Schumpeter coined the phrase ‘gales of creative destruction’. He argued that innovation led to such gales, which caused old ideas, technologies, skills and equipment to become obsolete. The question … was not ‘how capitalism administers existing structures … [but] how it creates and destroys them’. Creative destruction, he believed, caused continuous progress and improved standards of living for everyone.
This has serious and interesting consequences for policy makers – the models show that forcing too much competition between actors hurts the overall fitness of the system, and so does too much protection. So most of the time they must resist the urge to exhort and fiddle – which comes right back to the work Deming did all those years ago, which no one remembers and everyone should read: Four Days with Doctor Deming.
But if they resist the temptation to exhort and fiddle without enough information, if they don’t listen to lobbyists, and if they make sure decisions are made locally by the people who know what’s needed – if they do all these things – we don’t need that many of them, and they don’t need to do much to keep things steady.
I’ve just started reading Antifragile, which addresses the other side of this equation: given that the world is very unpredictable, isn’t it better to create institutions and systems that aren’t made brittle by command-and-control mania, and that even benefit from small shocks and changes? I will return to this subject in my next review. I think I have accidentally stumbled on two books that complement each other: Ormerod’s work gives a mathematical, scientific background, drawing on many disciplines, to say some very similar things.
So, read this book. It will surprise you and leave you thinking about how we should do things so that failure is part of what happens but isn’t a catastrophe.