May 9 2018

According to David Kynaston’s history of the Bank of England (Till Time’s Last Sand), financial crises of one sort or another seem to occur at roughly 10-year intervals and, usually, in the autumn.  Oh dear!  Let’s hope, then, that all the lessons of the last crisis have been properly learnt.  After all, as a famous financial journalist wrote: “The great wish on the part of the English people as to currency and banking is to be safe.”  Well, quite – and as true now as when it was written – in 1844 by Walter Bagehot.

My take on the last 30 years or so of financial crises and how the sector is responding can be found here in May’s edition of FS Focus.  (My thanks to John Mongelard of the ICAEW and Chris Evans, the editor, and his team.)


The article can also be found here (on page 32).

Photo by rawpixel on Unsplash

Humans, Algorithms and Assessing Risk

May 3 2018

Another day.  Another IT failure, this time in the NHS, where a “computer algorithm failure” meant that for the last nine years women over 68 were not called for breast cancer screening.  Some may have died as a result.

The eminent cancer specialist Karol Sikora made the point today that this should have been spotted sooner – “Alarm bells should have rung sooner based on a simple observation of the patients who were coming and going.  The fact that they didn’t is, I think, indicative of a problem – a blind spot – that exists across the health service.”

And what might that blind spot be?  Well, a belief in the infallibility of the technology they were using.  “They are no longer as tuned into what they are seeing or what their instinct and experience might be telling them.”   A blind spot found in many sectors other than the NHS.

And spare a thought for poor old “instinct and experience”, not to mention the evidence in front of your eyes.  Too often they are dismissed as impossible to measure and, therefore, of no use.

But even that technology titan, Elon Musk, recently acknowledged that the reason for delays in the production of Tesla’s latest model was an over-reliance on automation (in this case, a naughty flufferbot), adding in a tweet on 13 April: “Humans are underrated.”

Indeed they are.  Any effective assessment of risk should never rely on one source only.  What you see in front of your eyes is as important as what you see on a screen. Technology is part of the answer, never the whole answer.  Experience and judgment also matter.


Photo by 수안 최 on Unsplash

Compare and Contrast

April 20 2018

Today’s announcement by the FCA and PRA about the draft warning notice to Barclays boss Jes Staley seems to bring to an end (assuming the level of the fine is not contested, not necessarily a given) the lengthy investigation into his conduct when he twice sought to uncover the identity of an anonymous whistleblower, contrary to the rules, good practice and, one hopes, Barclays’ own internal procedures.

It’s noteworthy that the FCA and PRA are saying that Mr Staley had not acted with sufficient care when he set out to uncover the identity of the whistleblower.  Unfortunate wording, since it implies that what they are criticising is his ineptness in doing so, rather than his failure to realise that it was quite improper for him to try at all, let alone twice.  But the regulators have shied away from any suggestion of impropriety, since questioning Mr Staley’s fitness and propriety would have called into question whether he should or could continue in his role at all.

Always tricky to get the tone right when criticising those at the top.  So we must wait for the final public notice to see the basis on which the regulators have come to their decision.

Elsewhere, Sir Alan Parker has resigned his position at Save The Children two weeks after the Charity Commission announced an inquiry into how the charity had handled sexual harassment claims dating back to 2012 and involving senior executives, including the former CEO and policy director.  As with other similar claims, it is not so much the initial allegations themselves (bad as they may be) which have led to grief but the way they were initially investigated and how those raising concerns were treated.  Better procedures and more training will undoubtedly be required but, even more importantly, what is really needed is an understanding that ignoring the messenger is always the wrong thing to do (no matter how mixed their motivation may be), as is hoping that the problems will go away if ignored.  They won’t.  That is a lesson others in public life might also usefully learn.

Still, if banking has not yet got it right, it is at least doing better than the NHS, despite the recommendations of the Francis Report following the events at Stafford Hospital, as this programme describes.  Worth listening to just to hear a lawyer who had dealt both with NHS whistleblowers and with ones working in a bank say, with more than a touch of incredulity in her voice, that a bank – “a bank!” – had got it right about how to treat a whistleblower and investigate their claims.  Compliments must be taken where they can.


Photo by James McGill on Unsplash