News

Setting the right example?

May 11 2018

Well, the first FCA/PRA enforcement decision against a senior manager – a CEO, no less (Barclays’ CEO, Jes Staley) – is out and can be read here.  Mr Staley is fined – a total of £642,430 – and Barclays has also announced that his bonus has been reduced by £500,000.  All very aggravating for Mr Staley, no doubt.  And Barclays faces continuing review of its whistleblowing framework and processes.

But there are some interesting features to the FCA’s reasoning which warrant a closer look:

(1) Much is made of the conflict of interest which Mr Staley had in relation to the first anonymous letter which was sent and why it was, therefore, wrong of him to get involved in decisions about whether the letter’s allegations should be classified as whistleblowing and investigated.  All very true.  But a CEO – any senior manager, indeed, any manager at all – should not get involved at all in making such a decision (let alone be involved in the investigation) regardless of whether they have an actual or potential conflict of interest.  The decision about whether something is or is not a whistleblowing allegation should be made by the team in charge of whistleblowing.  No-one else.  The FCA’s focus on the conflict of interest point risks creating the impression that it may be OK for senior managers to be involved in deciding this when they face no actual or apparent conflict of interest.  Will this really give potential whistleblowers the reassurance they need?

(2) The level of the fine was kept down because Mr Staley was deemed to have acted only negligently.  According to the facts set out in the Final Notice, Mr Staley appears to have taken a number of deliberate steps for reasons which made sense to him at the time.  To describe these as mere negligence might be viewed as generous.

(3) The seriousness of the breach is classified as a Level 2 (out of 5) breach, partly because of the negligence and partly because there was no profit made or loss avoided and there was little or no loss or risk of loss to consumers, investors and the market generally.  But these latter two tests are largely irrelevant in the case of something as important as whistleblowing.  The risk of his actions was that they sent out a message to anyone concerned about misconduct at the bank that, whatever the procedures said, senior management’s instinctive reaction was to try and shoot the messenger and/or dismiss the allegations.  They sent out a message that the bank – at the highest levels – did not appear to value the integrity or independence of the whistleblowing investigative process.  How could anyone wishing to raise concerns feel confident that their allegations would be taken seriously, investigated properly and treated confidentially?  And if this could happen at this bank, how could anyone be confident that it would not be the same at other financial institutions?

This was an opportunity to send out a very clear signal to the whole market about the importance of whistleblowing: not just the existence of procedures but the reality in practice of a function where the whistleblowing team is independent, in charge, trusted, not undermined or second-guessed by management and has the necessary skills, experience and resources to investigate allegations properly.  And, critically, that senior managers need to live by the rules applicable to others, not simply espouse them.

The acid test for whether a culture has really changed for the better is whether those at the top are treated in the same way as those at the bottom when they misbehave.  Let’s hope that this is the lesson the sector learns from this decision.

And, finally, a plea: misbehaviour / breaches of rules are not “inappropriate” (a word best used for social or grammatical solecisms) but wrong.  It would be nice if the regulators were to say so.

What goes around…

May 9 2018

According to David Kynaston’s history of the Bank of England (Till Time’s Last Sand), financial crises of one sort or another seem to occur at roughly 10-year intervals and, usually, in the autumn.  Oh dear!  Let’s hope, then, that all the lessons of the last crisis have been properly learnt.  After all, as a famous financial journalist wrote: “The great wish on the part of the English people as to currency and banking is to be safe.”  Well, quite – and as true now as when it was written – in 1844 by Walter Bagehot.

My take on the last 30 years or so of financial crises and how the sector is responding can be found here in May’s edition of FS Focus.  (My thanks to John Mongelard of the ICAEW and Chris Evans, the editor, and his team.)


The article can also be found here (on page 32).


Humans, Algorithms and Assessing Risk

May 3 2018

Another day.  Another IT failure, this time in the NHS where a “computer algorithm failure” meant that for the last 9 years women over 68 were not called for breast cancer screening.  Some may have died as a result.

Eminent cancer specialist Karol Sikora made the point today that this should have been spotted sooner – “Alarm bells should have rung sooner based on a simple observation of the patients who were coming and going.  The fact that they didn’t is, I think, indicative of a problem – a blind spot – that exists across the health service.”

And what might that blind spot be?  Well, a belief in the infallibility of the technology they were using.  “They are no longer as tuned into what they are seeing or what their instinct and experience might be telling them.”   A blind spot found in many sectors other than the NHS.

And spare a thought for poor old “instinct and experience”, not to mention the evidence in front of your eyes.  Too often seen as impossible to measure and, therefore, of no use.

But even that technology titan, Elon Musk, recently acknowledged that the reason for delays in the production of Tesla’s latest model was an over-reliance on automation (in this case, a naughty flufferbot), adding in a tweet on 13 April: “Humans are underrated.”

Indeed they are.  Any effective assessment of risk should never rely on one source only.  What you see in front of your eyes is as important as what you see on a screen. Technology is part of the answer, never the whole answer.  Experience and judgment also matter.

