Last week Professor Alexis Jay issued her report into the scandal of child sexual abuse in Rotherham, in the North of England. It makes for shocking reading: 1,400 children were sexually abused in the town over 16 years, and the report is overwhelmingly critical of the police, social workers and town councillors, all of whom were found to have let the victims down. Yet as The Daily Telegraph reported, the town council’s website introduced the report’s findings this way: “Services to protect young people at risk from child sexual exploitation in Rotherham are stronger and better co-ordinated today than ever before, an independent review has found”.
This interpretation by the Council is an extreme case of a phenomenon known as ‘confirmation bias’: the tendency to favour information that confirms our preconceptions or assumptions, whether they are true or not. The report is overwhelmingly critical, yet the Council have chosen to find a tiny piece of metal in the haystack of this report and call it a needle.
The term ‘confirmation bias’ was coined by the British psychologist Peter Wason following a series of experiments in the 1960s which demonstrated the phenomenon. Such biases are particularly strong for emotionally significant issues, like child abuse, and for firmly established beliefs.
As a research and analytics company, our job is often to present data on an organisation’s reputation or brand image. Usually the audience is the same organisation that commissioned the study: different individuals, perhaps from many different departments, but all from the same organisation. The data is factual (not anecdotal evidence or personal opinion), collected in much the same way as political polling is conducted. The data is there to be assessed and tested, but because of confirmation bias it often is not, at least not sufficiently.
Understandably, people find it very difficult to confront the possibility that their beliefs, their way of seeing the world, may be wrong. In an organisational context this is compounded. With peers, managers and senior managers in the room, it is extremely difficult to see things as they are rather than the way the organisation, as a collective, sees them. In part, confirmation bias is a defence mechanism: we weigh up the costs of being wrong, and at work those costs can be considerable, personally as well as for the organisation. Our tendency to succumb to confirmation bias can lead to deeply flawed decision-making – witness Rotherham Council’s website.
So “hats off” to those of you who see data and information for what they are; who see things as they are, take the lead and resolve to test, and if necessary disprove, the accepted organisational wisdom. It is not for the faint-hearted. It is extremely challenging, personally and professionally.
But for most of us, helping to build something better is, deep down, so much more satisfying than living with the fear of waiting to be found out.