Bone Marrow Transplantation | 2021

The impact of cult behavior on haematopoietic cell transplant practices: believers and non-believers


Abstract


Sigmund Freud, writing in 1928 in The Future of an Illusion, distinguished between errors, delusions and illusions [1]. Simply put, errors are mistakes not driven by wishes; delusions are errors which are wish-driven; and illusions are beliefs, some true and some false, but all wish-driven. Transplant experts, like all other humans, are subject to errors, delusions and illusions. When a group of people share a delusion, psychologists term it a cult.

Cults are as old as mankind. In 1841 Charles MacKay, a Scottish journalist, published a three-volume treatise entitled Extraordinary Popular Delusions and the Madness of Crowds, a study of crowd psychology [2]. Volume 1, National Delusions, dealt with economic bubbles and financial delusions such as the Dutch tulip mania of the early seventeenth century. Volume 2, Peculiar Follies, dealt with mass delusions such as the Crusades and witch trials in the Middle Ages. Volume 3, Philosophical Delusions, dealt with practices such as alchemy (typically, efforts to turn base metals into gold). MacKay noted many alchemists and their sponsors were themselves deluded, convinced this was possible. Perhaps the most fateful follower of this delusion was King George III who, facing a huge debt from fighting the Seven Years' War (French and Indian War), hired alchemists to work in the basement of Buckingham Palace producing gold. Unfortunately, the King is thought to have had porphyria, and arsenic, commonly used in alchemy, may have precipitated attacks resulting in the madness of King George. (He was said to have mistaken an oak tree for the King of Prussia.) (King George was said to have had blue rather than red urine and may have had familial benign hypercalcemia or have been treated with gentian violet. Alternatively, he may simply have been mad [3].) During these psychotic episodes the King imposed the Stamp Act, ultimately resulting in American independence and proving not all delusions end unhappily (at least for Americans).

The Oxford English Dictionary offers several definitions of a cult. The most relevant to this typescript is: something popular, fashionable or delusional, especially among a particular section of society. Most of us belong to groups psychologists and sociologists might reasonably describe as a cult. Often this involves believing something seemingly incredible and/or unprovable, such as God visiting ten plagues on the Egyptians or Jesus being the Son of God. Millions or billions of people believe these stories absent convincing proof. Important examples of cult-like delusions among scientists are the beliefs that people act rationally or that everything happens for a reason. We recently discussed in the Journal the fallacy that people act rationally [4]. The delusion that everything happens for a reason denies the important role of chance in biology (reviewed in ref. [5]).

Why are delusions so powerful and persistent? William Bernstein, writing in The Delusions of Crowds, commented: "Humans understand the world by narratives. However much we flatter ourselves about our individual rationality, a good story, no matter how analytically deficient, lingers in the mind, resonates emotionally, and persuades more than the most dispositive facts or data" [6]. Opinions of crowds, when not delusional, can be useful.
In another Journal article we discussed the wisdom of crowds, citing the famous experiment of Sir Francis Galton (cousin of Charles Darwin, inventor of the correlation coefficient and of regression to the mean but, sadly, father of modern eugenics). About 150 years ago at an English county fair, people in a crowd were challenged to guess the weight of an ox, the winner taking home the butchered animal. Galton thought average people would be terrible at correctly estimating the ox's weight compared with butchers (experts). He collected the estimates from both cohorts, many ordinary people and a few butchers. The public's averaged guess was much closer to the true weight than the butchers'. The experiment, this time with Penelope the cow, was recently repeated online (https://www.npr.org/sections/money/2015/08/07/429720443/17-205-people-guessed-the-weight-of-a-cow-heres-how-they-did) with the same result, namely that the averaged opinion of many non-experts is often more accurate than the opinion of so-called experts. In another Journal article Barosi and RPG discuss limitations of consensus statements and clinical practice guidelines developed by a few experts [7].

Often people simultaneously hold obviously contradictory beliefs. For example, one may occasionally encounter a physician who believes sickness is an illusion curable by prayer while treating a sufferer with modern drugs which he believes will improve the condition. The processes by which people manage these discordances are termed compartmentalization and cognitive dissonance by psychologists [8]. The latter is an unpleasant feeling arising when one's belief is confounded by clearly contradictory data. These remarkably adaptive processes allow people to maintain their delusions. Lest this seem too far afield, consider how often a physician helping a patient decide about a potentially dangerous intervention will predict an outcome while at the same time acknowledging substantial inaccuracy and imprecision (in the form of wide confidence intervals around point-estimates) in his or her publications on the same intervention.

By now the reader must be wondering what delusions and cult behavior have to do with haematopoietic cell transplants. We hasten to explain. Sometimes an idea takes hold in the minds of scientists which makes such sense that it is difficult to refute despite considerable data proving the idea incorrect. Recently, we discussed the cult of

Pages 1–3
DOI 10.1038/s41409-021-01473-w
