In his analysis, "Narrow banking: the reform of banking regulations", John Kay wrote, in the context of the near-collapse of the financial system last year:
"The issue of robustness is central. No systems, however well designed, can eliminate mistakes and failures. While good systems seek to reduce the likelihood of mistakes and failures, a central feature of all well designed engineering – and biological – systems is that they are robust to the failures that will inevitably occur.
"Robust systems are structured so that failures can be contained within a single component, or so that error correction mechanisms come into play. In other interconnected utilities, such as water or electricity, substantial resource – both technical ingenuity and capital expenditure – is devoted to ensuring that such failsafe measures exist, which is why major disruptions are rare. Financial services are different. But they should not – and need not – be different.
Robust engineering systems are designed with modularity – so that one component can fail, and be replaced, with little damage to the whole. They have independent backup systems. They are loosely coupled, so that small disruptions are easily absorbed. All financial institutions apply these principles to their technology, but similar measures are not in place – or widely thought relevant – for the substantive operations of these institutions, or for the financial system as a whole."
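The principles Kay names – modularity, independent backups, loose coupling, containment of failure – can be sketched in a few lines of code. This is purely an illustrative toy (the component names and the `robust_process` helper are invented for this example, not drawn from any real banking system): interchangeable components, an independent backup, and a caller that never sees a single component's failure.

```python
# Illustrative sketch of Kay's failsafe principles. All names here are
# hypothetical; this models no real payment infrastructure.

def make_payment_processor(fails: bool):
    """Build an interchangeable component; `fails` simulates a fault."""
    def process(amount):
        if fails:
            raise RuntimeError("component failure")
        return f"processed {amount}"
    return process

def robust_process(amount, components):
    """Loose coupling: try each independent component in turn.

    A single component's failure is contained here and never
    propagates to the caller, as long as one backup survives.
    """
    errors = []
    for component in components:
        try:
            return component(amount)
        except RuntimeError as exc:
            errors.append(exc)  # contain the failure, fall through to backup
    raise RuntimeError(f"all {len(errors)} components failed")

primary = make_payment_processor(fails=True)   # the primary fails...
backup = make_payment_processor(fails=False)   # ...an independent backup absorbs it
result = robust_process(100, [primary, backup])
```

The point of the sketch is the design choice, not the code: because the caller depends only on the interface, any single module can fail and be swapped out with little damage to the whole – exactly the property Kay argues the financial system lacked.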
This is very refreshing, but it is pretty much obvious to any scientist – mathematician, physicist, chemist, computer programmer or engineer. It is the crux of any logical or engineering design. To any first-year engineering student the concept is blindingly obvious.
However, at the beginning of this year, Charles Goodhart argued that:
"One of the lessons of the recent crisis, a lesson for bankers and for regulators, is: hire fewer mathematicians and physicists who build models on the basis of data that they can observe over relatively short periods, and hire a few more historians who know what can go wrong even if they don't necessarily have a good data basis to put into particular models".
Good luck to the historians, political scientists, social anthropologists and so on if a few more of them are indeed hired by the banks and put in charge of designing modular, failsafe complex systems. Is it not telling that they are not employed in a similar capacity in the automotive industry, in shipbuilding, or by aircraft manufacturers? It would be surprising if many of them understood what John Kay is writing about.