How to spot bull***t: A guide for managers

13 April 2017


Rubbish, nonsense, and tosh. Casuistry, obscurantism, and sophistry. B******ks, bull***t, and humbug.

Guest blogger Adrian Furnham

Ten years ago an academic moral philosopher wrote a book called On Bull***t, which won a number of book awards. For him, bull****ting was different from lying. A bull****ter is a publicist whose principal aim, through different media, is to impress others, usually with profound, counter-intuitive or surprising ideas or facts. The motive is usually to make money and/or gain followers.

Social media presents a golden opportunity for them to “spread their wares”: gurus, pundits and self-proclaimed geniuses can easily reach hundreds of thousands of people by spreading the word.

As a manager, it is vitally important to be able to spot the bull***t. Whether it be someone exaggerating their prowess during an interview or a potential trading partner over-egging the benefits of a deal, bull***t can have dire consequences when left undetected.

But how good are people at “bull***t detection”? Just as most of us are “victims” (or is it “beneficiaries”?) of the Lake Wobegon effect (a real and pervasive human tendency to overestimate one's achievements and capabilities in relation to others), so we believe we can relatively easily spot the difference between fact and fiction, truth and tosh, reality and rubbish.

Last year a group of Canadian academics wrote a paper in a distinguished academic journal (Judgment and Decision Making, 10, 549-563) entitled “On the reception and detection of pseudo-profound bull***t”. Their questions were what sort of people believe in bull***t and what sort of bull***t is most often detected rather than accepted.

They gave people a range of “seemingly impressive assertions, presented as true and meaningful but actually vacuous”. This was the pseudo-profound bull***t, and they had no difficulty in finding it: for example, “Imagination is inside exponential space time events”. They found the online New Age Bull***t Generator particularly useful in putting together their scale. This they gave to students along with intelligence tests, tests of thinking style and reasoning, and tests of ontological confusion and religious beliefs (after-life, angels, heaven, hell, miracles, Satan, soul, etc.).

In their first study, the academics found, as predicted, that brighter, more analytic thinkers were more likely to detect the bull***t, while the more ontologically confused and the more religious were poorer detectors. They replicated their findings in a second study, adding other tests and showing that the more faith people had in intuition, and the more they accepted paranormal beliefs, the worse they were at bull***t detection.

In a third study, they showed that those who thought mundane statements profound (“Most people enjoy some sort of music”) were also poor bull***t detectors. The fourth study looked at those who believed in complementary medicine as well as in general conspiracy theories. And yes, as hypothesised, fans of alternative and complementary medicine and those prone to conspiracy theories were also poor detectors.

Is all this essentially stating the “bleeding obvious”?

Brighter people and those who are prone to critical, analytic thinking are less willing to accept all sorts of claims, be they about medicine (complementary), politics (conspiracy) or religion. They are, in short, better detectors of the pseudo-profound bull***t peddled by a range of gurus. To a large extent, all academic disciplines teach critical thinking, from literary criticism to evidence-based medicine.

But are people getting better at bull***t detection, and why should it matter? Psychologists have long been interested in a phenomenon called the Barnum effect, named after a notorious 19th-century hoaxer and master of spin.

The effect describes a failure of bull***t detection: people accept personality feedback about themselves because they believe it is derived from a valid psychometric test. They fall victim to the fallacy of personal validation, accepting generalisations that are true of nearly everybody as specifically true of themselves.

More than 60 years ago, a psychologist called Stagner gave a group of managers a personality test. He gave each of them bogus feedback in the form of statements derived from horoscopes, graphological analyses and so on. Over half felt their profile was an accurate description of them, and almost none believed it to be wrong.

Another professor, called Forer, gave personality tests to his students, ignored their answers, and gave each student an identical evaluation. The first three items were:

  • “You have a great need for other people to like and admire you”
  • “You have a tendency to be critical of yourself”
  • “You have a great deal of unused capacity that you have not turned to your advantage”.

His explanation for the Barnum effect was in terms of human gullibility. People tend to accept claims about themselves in proportion to their desire that the claims be true rather than in proportion to the empirical accuracy of the claims as measured by some non-subjective standard.

This confirms another principle in personality assessment, the ‘Pollyanna principle’, which suggests that there is a general tendency to use or accept positive words or feedback more frequently than negative words or feedback.

Managers must therefore be careful not to take everything at face value, and must make sure they do not fall into this trap of natural human gullibility.

Sometimes if it sounds too good to be true, maybe it’s just bull***t.
