Posts Categorized: Risk assessment
Back to Basics
July 30 2019
Ever since the financial crisis started, there has been a plethora of explanations for why traders and bankers behaved as they did. Some have been purely descriptive: what happened and when, allowing us to marvel at the folly of it all, at least in hindsight. At the time, these clever financiers were praised by pretty much everyone from Chancellors down. Very few pointed out that the Emperor had no clothes.
But increasingly there have been attempts to use the insights gleaned from other disciplines to explain why events unfolded in the way they did. The latest neuroscientific findings were used to describe the biology of boom and bust (The Hour Between Dog and Wolf, for instance). Behavioural economics has had its say, as has nudge theory. Rather than nudging people to behave well, all the payment and reward incentives nudged financiers into doing what suited them financially, irrespective of the effect on the customer and no matter how good the firm’s expressed intentions were. Goodness! Whoever could have predicted that, without a theory to explain it.
Psychologists have had their say, of course, though only a cynic might wonder about how much actual knowledge about the realities of life in the financial sector they have. No matter: all could opine merrily on the importance of culture in finance and on all the wonderful insights that these disciplines could bring to those seeking to manage and regulate the financial sector.
And now the anthropologists have got in on the act, as in this article by Gillian Tett. In it she points out how anthropologists have tried to analyse the cultural patterns, the rituals and symbols, even the words people use in finance to understand what was going on under the surface. In truth, the insights brought by anthropologists (at least as described here) are pretty obvious rather than thought-provoking; the article does not need them to be worth reading.
What is interesting, though, is how commentators on finance, and perhaps regulators too, are, perhaps unconsciously, making the same mistake as many of those traders and bankers. They are over-complicating, coming up with all sorts of theories and hypotheses apparently grounded in science or other social studies, described and interpreted by experts, using technical language for common human behaviours. Just as too many traders developed over-complicated products which they only half-understood, and managers kidded themselves into believing that they had found a foolproof solution to valuation or risk management or any of the other difficult tasks they had, so there is a risk of developing overly complex explanations for why so many people behaved so stupidly or worse. The risk is that the more complex the explanation, the more people feel that it is all too difficult really to do anything about it, or that this is something best left to the culture specialists, psychologists and other “ologists”.
Keep it simple might be the motto. In the end, by whatever means the conclusions are reached, what everyone in finance needs to remember is this:
- Trust is at the heart of finance.
- Everyone in a financial institution is, in one way or another, managing risk. There is no such thing as a risk-free product or institution. Or, indeed, individual. Understanding the risk you are running and managing it properly is what every bank, every employee in a bank, every customer of a bank, every shareholder in a bank, every investor in a financial product and every regulator of a bank is doing. Or ought to be doing.
- Understanding properly is hard work. There is no magic bullet, algorithm, theory, process, spreadsheet, AI or killer piece of management information which will do it for you. Thinking is often required.
- There is no way of eliminating risk. Mitigating and minimising it: yes. Eliminating it: no. If anyone says otherwise (and much of the financial crisis was caused as a result of clever people thinking they had done just this and learning, painfully, that they hadn’t) they’re a charlatan or worse.
- Human beings, even clever ones (particularly them, it sometimes seems) do not behave rationally around money. Money and emotions are bosom pals. As any decent novelist or lawyer dealing with divorces or wills will tell you. The “animal spirits” Keynes described do not just apply to market participants but to all of us.
- Managing people, understanding them, motivating them, inspiring and leading them, teaching them, setting them a good example, setting them high expectations and making it clear what the boundaries are, what behaviour will not be accepted, what crosses the line, helping them get past their frailties, working effectively with them is hard work, the hardest work anyone ever has to do. And by far the most valuable – and rewarding.
- Finance is there to serve others, not itself. It is a means to an end and the moment it (and the people in it) start thinking of themselves as indispensable, as set apart from the society they are part of, as entitled to special consideration and immunity from challenge is the moment when hubris sets in. Nemesis will surely follow.
A Risky Business
September 16 2018
According to this survey (taken this August), only 3% of people had a very positive view of financial services, with 57% having a very or somewhat negative view. And all this 11 years after the run on Northern Rock and a decade after the Lehman Brothers bankruptcy, the bailout of RBS, the Lloyds takeover of HBOS and the disappearance of venerable institutions redolent of Britain’s sober manufacturing past, such as the Bradford & Bingley Building Society. One might have thought that a decade would have been enough for people to forget what happened. But like an itch that continues to be scratched, banks have, right up to the present day, provided many more examples justifying customers’ perennial exasperation with financial services providers: closure of branches, endless IT problems, the continuing PPI mis-selling saga, interest rates for savers still at rock bottom, mis-selling and mis-advice over pensions. Even the much vaunted culture change programmes embarked on by many banks don’t seem to have changed perceptions, possibly because some of the sector’s leaders have not fully appreciated that this applies to them too.
The 10-year anniversary has brought out two figures from the past to give their take on where we are now and, in so doing, they managed to compliment themselves (without seeming to, unless that was the point of the exercise) on their past successes. The first was Gordon Brown, the Prime Minister in charge when the crisis struck and famous for having claimed in Parliament that his efforts “saved the world” or its banks, anyway. Certainly, the efforts of his government in autumn 2008 prevented the failure of the entire British banking system. Would it be uncharitable to consider what responsibility his government (and the previous government in which he served as Chancellor) had for the state in which banks found themselves that autumn? Had earlier warning signals perhaps been ignored by regulators? Still, his claim that a more fractured system of political governance might make it harder for governments to co-operate should another financial meltdown occur is well made. It is not just within financial institutions that silos can prevent those at the top seeing the full picture; the same can happen at governmental and regulatory levels too.
And so to Bob Diamond, never shy about arguing the case for aggressive investment banks and the need to take risk, who popped up on the radio last week to tell us that we should view Barclays (which did not get government funding) very differently to RBS, which did. Possibly a touch premature, given that the SFO trial of senior Barclays executives in relation to Barclays’ capital raising that autumn is not due to start until January 2019. (Even Diamond’s previous arch-critic, Lord Mandelson, after his change of heart, has weighed in echoing his criticism.) Far from being concerned about a breakdown of trust between governments (Brown’s concern) or, indeed, trust in banking, let alone the culture at Barclays or other banks in the period leading up to the crash, Diamond thinks that the changes made in the last decade have made banks “too risk averse”, that without risk, banks won’t lend, the economy won’t grow.
Both men have a point. But they miss something which has not been much canvassed in the reams of commentary devoted to what happened a decade ago. Regardless of how well risks are understood, regardless of how co-operative governments and regulators are, regardless of how good the rules are, regardless of how many wonderful AI-developed risk management systems are used, there will never be a perfect financial system. Or a perfect regulatory system. Problems will always arise. And there will be warning signs – about people, about institutions, about certain types of business. They may not be obvious or easy to read. As the haystack gets bigger, finding the needle in it becomes ever harder. Identifying what needs to be followed up and what can be ignored takes skill and experience. Sensing what might become serious and getting people to act before it does so takes persistence. No-one wants to be a Cassandra, endlessly forecasting doom. Even fewer want to listen to her.
Being prepared for the next big meltdown is necessary. But just as much effort – rather more, in fact – needs to be focused on listening to, and acting on, those warning signs: on catching problems (whether mistakes, incompetence or deliberate wrongdoing) early, when they are small, when they can be contained and resolved without too much pain or collateral damage, when they can become learning opportunities for all rather than crises to be managed. Problems, however small, don’t just need fixing then forgetting. They also tell you a story – about the institution, about the people in it, about how business is done. If we are to avoid the inevitable recitation, after every scandal, of the numerous opportunities when the issue might have been identified, acted on and stopped, or at least mitigated, it is a story which needs to be listened to.
After all, Cassandra turned out to be right.
Humans, Algorithms and Assessing Risk
May 3 2018
Another day. Another IT failure, this time in the NHS, where a “computer algorithm failure” meant that for the last nine years women over 68 were not called for breast cancer screening. Some may have died as a result.
The eminent cancer specialist Karol Sikora made the point today that this should have been spotted sooner: “Alarm bells should have rung sooner based on a simple observation of the patients who were coming and going. The fact that they didn’t is, I think, indicative of a problem – a blind spot – that exists across the health service.”
And what might that blind spot be? Well, a belief in the infallibility of the technology they were using. “They are no longer as tuned into what they are seeing or what their instinct and experience might be telling them.” A blind spot found in many sectors other than the NHS.
And spare a thought for poor old “instinct and experience”, not to mention the evidence in front of your eyes. Too often they are seen as impossible to measure and, therefore, of no use.
But even that technology titan, Elon Musk, recently acknowledged that the reason for delays in the production of Tesla’s latest model was an over-reliance on automation (in this case, a naughty flufferbot), adding in a tweet on 13 April: “Humans are underrated.”
Indeed they are. Any effective assessment of risk should never rely on one source only. What you see in front of your eyes is as important as what you see on a screen. Technology is part of the answer, never the whole answer. Experience and judgment also matter.