Why mind viruses are real
Yes, comparing other people’s beliefs to viruses can be cheap and gratuitous, but the idea of “mind viruses” has real scientific merit.
Are there any such things as mind viruses? And if you were infected by one, how could you tell? Viruses of the mind seem to be all the rage these days. In his 2020 bestseller The Parasitic Mind, evolutionary psychologist Gad Saad identifies “the tyranny of political correctness” and other “infectious ideas” that are harming our societies. A year later, from a different ideological angle, the philosopher
Andy Norman published Mental Immunity, a guide to boosting your mental immune system against infectious “mind parasites.” And in his popular science book Foolproof (2022), psychologist Sander Van der Linden advocates “mental inoculation” against misinformation, fake news, and conspiracy theories. In various ways, each of these books suggests that beliefs act like infectious parasites spreading from one brain to the next.

Perhaps because of the devastation wrought by the coronavirus, we have all become more attuned to the possibility of tiny pathogens invading our bodies and propagating at our expense. In fact, while we were still in the throes of the pandemic, the WHO itself coined the portmanteau term “infodemic” to refer to outbreaks of disinformation and conspiracy theories about the coronavirus. But the idea itself is much older than that. More than thirty years ago, the biologist
Richard Dawkins wrote about “viruses of the mind” (focusing mostly on religion), in line with his earlier idea of selfish “memes,” a term he coined in his 1976 book The Selfish Gene. And as far back as the late 19th century, sociologists like Gabriel Tarde and Gustave Le Bon argued that cultural ideas and behaviours spread through society much like infectious diseases, through imitation and repetition.

Still, many academics remain dubious. The whole register of disease metaphors (virus, infection, parasite, contagion), they argue, is tendentious and misleading. At best, it just redescribes something we have known all along: namely, that cultural ideas are transmitted from one person to the next. More worryingly, the metaphor suggests that humans are just hapless, gullible victims of whatever infectious ideas they come across. This is not just wrong, they argue, but can also lead to social panics. At worst, by pathologising beliefs, we can end up demonising those who hold them.
There is definitely something to these worries. Both the Left and Right seem to be equally enamoured with talk about mind viruses, but of course, each side tends to diagnose them among the opposite camp. There is a universal temptation to resort to cheap medical or psychiatric labels for ideas we dislike, which we should resist. To give just one example: during the 2016 US election, each of the two major American political tribes branded the other as suffering from a psychiatric condition: “Trump Derangement Syndrome” and “Clinton Derangement Syndrome,” respectively. A concept like “mind virus” can be weaponised in similar ways. Elon Musk’s recent vow to “destroy the woke mind virus” that “infected” his estranged daughter arguably falls into that category.
Still, the idea of viruses of the mind is hard to eradicate, no matter what ideological side you’re on—and we believe this is for good reason. Beliefs can be seen as viruses in three different senses, each of which builds on the others. The richest and most fertile meaning, as we’ll see, is the third: the Darwinian notion that some beliefs have evolved through largely blind selection, independent of our human intentions, to further their own propagation. And recent research (full disclosure: some of which done by us) shows that we should take that notion seriously. But let’s start with the first one.
[Read the rest of our piece at Quillette or by becoming a paid subscriber here]
Some beliefs spread like viruses.
Beliefs and other cultural ideas spread from one person to the next, forming chains of transmission similar to those of infectious diseases. This is the ordinary sense of the internet idiom of “going viral,” and it doesn’t imply anything unseemly. It is just one among a number of metaphors used to talk about the spread of culture (others include wildfires, avalanches, ripples on a pond, falling domino tiles, cascades, and waves).
In fact, going viral is the highest calling of many creators on social media, and the phrase is often used as a compliment. Distracted boyfriend memes, LOLcats, and parodies of Hitler’s Downfall rant go viral on TikTok or Twitter/X because their human consumers find them irresistibly funny, clever, cute, and original. There’s nothing wrong with drawing analogies between epidemiology and culture to help us understand the spread of ideas, as mathematician Adam Kucharski does in his fascinating book The Rules of Contagion: Why Things Spread—And Why They Stop (published in the pandemic year 2020). Kucharski isn’t interested in discrediting contagious ideas and behaviours; he just wants to understand how they take off, spread, and fizzle out, by borrowing concepts like “reproduction number” from epidemiology. Such analogies don’t necessarily suggest that people just mindlessly copy whatever ideas they come across. Ask any online content creator who dreams of going viral: it is very difficult to persuade millions of people to click the ‘share’ or ‘like’ button.
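The reproduction-number idea that Kucharski borrows from epidemiology can be made concrete with a toy simulation. The sketch below is our own illustration, not code from The Rules of Contagion; the function names and parameter values are hypothetical. Each person who adopts an idea passes it on to a Poisson-distributed number of new people with mean r: when r is above 1, outbreaks tend to snowball; when it is below 1, they tend to fizzle out.

```python
import math
import random


def poisson(rng: random.Random, lam: float) -> int:
    """Draw from a Poisson distribution with mean lam (Knuth's algorithm)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= threshold:
            return k - 1


def simulate_spread(r: float, generations: int = 10, seed: int = 1) -> list[int]:
    """Return the number of new adopters per generation, starting from one.

    Each adopter converts Poisson(r) new people. With r > 1 the chain of
    transmission tends to explode; with r < 1 it tends to die out.
    """
    rng = random.Random(seed)
    counts = [1]  # one initial adopter ("patient zero")
    for _ in range(generations):
        new = sum(poisson(rng, r) for _ in range(counts[-1]))
        counts.append(new)
        if new == 0:  # the chain of transmission has died out
            break
    return counts
```

A run with a subcritical rate like `simulate_spread(0.5)` will usually die out within a few generations, while `simulate_spread(2.0)` will usually explode; but the fate of any single chain remains a matter of luck, which is one reason going viral is so hard to engineer on demand.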
In fact, because people generally care about evidence and facts, often nothing is more contagious than the truth. If there were a devastating tsunami in the Philippines right now, within hours or even minutes true beliefs about such an event would have “infected” hundreds of millions of minds, spreading at a rate of which even the most effective peddler of disinformation can only dream. This is one reason why the often-heard claim that falsehood spreads faster than truth is highly misleading. As Douglas Adams once wrote: “Nothing travels faster than the speed of light with the possible exception of bad news, which obeys its own special laws.”
Some cultural beliefs are pernicious.
A second and richer sense of “mind virus” refers to the subset of cultural beliefs and ideas that are harmful. This is the sense of the WHO’s “infodemic” and the one used throughout Van der Linden’s Foolproof. Like other social psychologists, Van der Linden is mostly interested in the spread of fake news, science denialism, disinformation, and baseless conspiracy theories.
There is nothing wrong per se with using epidemiology as a source of analogies for our misinformation problem. In other contexts, beliefs are also regularly compared to poisons (toxic, venomous), cancers (malignant, metastasising), pollution (noxious, contaminating), or acids (corrosive, destructive). All these metaphors have somewhat different emphases. If you call a belief a virus, you draw attention to social influences and patterns of transmission; if you call it a poison, you highlight its insidious and slow-acting effects. It would be impossible to purge our public discourse—note the pollution metaphor here—of such imagery, as our language is completely permeated by metaphors. In a recent critique of “mind virus” talk, which makes some excellent points, philosopher
Dan Williams ironically resorts to poison imagery multiple times to describe the pernicious effects of the virus metaphor. Does that imply that, by his own lights, he’s “demonising” people by comparing their beliefs to cyanide or arsenic? Not at all. Williams is just driving home his point that some metaphors, though they look innocent enough, are slowly but surely befouling our public discourse. We disagree, but that’s fine, as we both understand that metaphors are not literally poisons.

It’s important not to conflate the different meanings of the term “mind virus.” Just because something spreads like a virus doesn’t mean that it causes harm like a virus. Often, nothing is more contagious than true facts or acts of kindness.
Some beliefs are evolved parasites.
Now we come to the richest and most fascinating sense of “mind viruses”: that some bad beliefs are designed by blind cultural selection to further their own propagation, rather than to serve the interests of their “hosts.”
Richard Dawkins was among the first to raise the possibility that some memes—bits of cultural information—can act in “selfish” ways similar to those of genes: those that are most successful at spawning copies of themselves, whatever that takes, will make it to the next generation. The late philosopher Dan Dennett and the psychologist Susan Blackmore are among the most avid proponents (superspreaders?) of this intriguing idea, applying it to religions, earworms, addiction, cultural innovation, and even consciousness.
Because Dawkins coined the word “meme” as the cultural counterpart of “gene,” the notion of a “meme” has unfortunately become wedded to physically discrete units and straightforward copying. But you don’t need to make this assumption to understand Dawkins’ most startling suggestion: that some ideas spread without benefiting their hosts—and may even kill those hosts, as long as that encourages their propagation.
Most of our beliefs, of course, don’t actively harm us, but then again neither do most biological viruses (some even benefit us). A virus just invades our bodies and hijacks our replication machinery to make more copies of itself. But it “doesn’t mind” harming us if that’s what it takes to end up in the next generation. Microorganisms like viruses or bacteria are called parasitical (as opposed to mutualistic or commensal) if they actively harm their hosts, and the same distinction can be extended to the realm of culture. Much like biological microbes, most of our beliefs will never harm us because they have “interests” that align perfectly with our own: they spread because they help us survive, make societies flourish, or because we find them clever, funny, titillating, or attractive (as with most stuff that goes viral on the Internet). But there are exceptions: some belief systems have, over time, evolved adaptations that do not just serve their own interests, but also subvert ours.
European Witch Hunts
One of the most spectacular examples of a destructive mind virus is the belief system surrounding witchcraft in early modern Europe. Beliefs in harmful magic and witchcraft had existed for centuries in many different cultures. But the system of beliefs related to witchcraft that emerged in Europe in the 15th and 16th centuries was specific and elaborate. Not only did this “memeplex” lead to large outbreaks of witch hunts—an estimated 50,000 to 60,000 people were killed as alleged witches—but many elements of the witch hunts of that time almost seem intelligently designed to maximise the virulence of those beliefs and to spread pernicious associated practices. The use of torture against suspects incentivised false confessions; the belief that witches met other witches at so-called “witches’ sabbaths” created chains of accusations; and the belief that witches could fly long distances on broomsticks to attend these sabbaths facilitated the spread of the trials over large geographical areas. When witch-hunting started in one place, it easily “infected” other villages and towns.
Once accusations started flying around, people saw fresh confirmations everywhere. When the alleged witches were palpably upset by the accusations, that was considered suspicious. But not responding emotionally enough was equally suspicious. If torture made suspects confess quickly, this counted as a demonstration of guilt. But if suspects withstood torture for months, this was also confirmation of guilt: such endurance could only result from demonic assistance.
What is most intriguing is that contemporary critics of witch-hunting were so impressed by the clever design of the belief system that they believed that there was an evil genius behind it. In their view, the Devil himself had secretly orchestrated the craze for witch-hunting—creating illusions of witchcraft in order to cause large-scale human suffering and ensure the damnation of the witch-hunters’ souls. In a similar vein, modern historians have long presumed that the idea of widespread witchcraft must have been cleverly invented by some interested agent—such as the Catholic church, an economic elite, or the patriarchy—to oppress women or the common people.
The problem is that those historians have never succeeded in identifying the inventors of any such master plan. Most people involved in the trials seem to have been genuinely terrified of witches. The witch hunts spread without much coordination, were initiated by people from a wide range of social groups, and often wrought terrifying havoc—in fact, in certain cases, accusations turned against those who had initiated the persecutions. When the trials developed a dynamic of their own, the accusers and persecutors could find their communities ravaged and their loved ones dead, and sometimes ended up at the stake themselves. It is unclear just who benefited from all this. But what if the witchcraft meme that spread in the 16th and 17th centuries was designed by blind selection to enhance its own propagation? We believe this is the most plausible explanation.
All sorts of beliefs about witchcraft have appeared at various points in history and in various places, but most of these ideas were not well adapted for survival within their contexts. To give one example: a German news sheet from 1555 claimed that witches were saved from the stake by the Devil just before they were burned. That was a sensational idea, but it made witch-hunting pointless, and so it failed to trigger any witch trials. Ideas such as the witches’ sabbath and their nightly flights, by contrast, survived precisely because they made witch-hunting more virulent.
Modern Mind Viruses
Few bad belief systems today are as deadly as the European witch-hunts, but many have similar design features, which make them extremely resilient against counterevidence and facilitate their dissemination. Many religious cults ruthlessly punish apostates and dissenters, and either make scepticism taboo or neutralise it with theory-internal explanations (for instance, that it is the Devil that makes people disbelieve in God). Their adherents also vigorously compete with rival faiths, or as Dawkins writes, using appropriately virological vocabulary: “The sufferer may find himself behaving intolerantly towards vectors of rival faiths, in extreme cases even killing them or advocating their deaths.”
Religions like Christianity and Islam have made proselytising into a sacred duty, and they have a range of unfalsifiable and conspiratorial explanations for any evidence that seems to contradict their belief systems (see the theological literature on the “problem of evil,” or rationalisations of why prayer often fails). It is possible that all these design features were created by clever priests and prophets to protect the faith, but it is equally plausible that they slowly accrued over time, going through a number of variations and mutations, until believers stumbled upon useful gambits that insulate their beliefs from criticism. By adopting those beliefs that are resilient, infectious, and impervious to refutation, believers are unwittingly triggering an evolutionary dynamic that is beyond their conscious control. The more virulent systems flourish; the more vulnerable ones wither away. In the most spectacular cases, such as jihadist beliefs about martyrdom and rewards in the afterlife, a viral belief may literally kill its hosts while broadcasting itself.
In our modern age, many unsubstantiated conspiracy theories (about 9/11 or the Moon landing, for instance) are also prime examples of belief systems with a self-sealing logic that is impervious to criticism. If there is no direct evidence to support the conspiracy, that is because the culprits have carefully covered up their tracks. If any evidence seems to contradict the conspiracy, it must have been planted by the evil plotters to throw us truth-seekers off the scent. And if anyone challenges the conspiracy, they must be a stooge in the pocket of the powers-that-be. This makes conspiracy theories into the intellectual equivalent of a black hole: easy to fall into, hard to escape from.
And what about the “woke mind virus”? It is striking that Critical Race Theory, at least as expressed by Robin DiAngelo and others, follows the same circular logic as witch hunts and conspiracy theories, as multiple people have observed. If you’re white and you disagree with the tenets of Critical Race Theory, as Julian Adorney puts it in Quillette, you’re just evincing “white fragility,” which is simply “proof of the validity of said accusation”; and if you dispute it as a member of a racial minority, then you’re suffering from “internalised oppression.” This heads-I-win-tails-you-lose logic is a clever gambit that has been discovered multiple times, in an example of convergent evolution. The Critical Theory catch-22 is analogous to Sigmund Freud’s notion of “unconscious resistance” against psychoanalysis, and to the Marxist notion of “false consciousness.”
Hapless Victims?
Calling a belief system a “mind virus,” many critics complain, portrays adherents as gullible and irrational, or at least as passive victims. But if anything, the opposite is true. The belief system instigating the European witch hunts evolved clever designs to adapt to its psychological and cultural environments. Many so-called “irrational” belief systems have remarkable internal coherence and are equipped with ample resources to deflect counterattacks. That’s hardly surprising. Evolving in the environment of human minds that care about truth and rationality, weird belief systems have to display a convincing illusion of reasonableness, a good simulacrum that will deceive people.
Our point is not to pathologise these beliefs as irrational or stupid, but precisely the opposite. We want to understand why smart people end up believing weird things. If an ordinary virus like the flu can trick your immune system, that doesn’t mean your immune system is stupid. In fact, it is an extremely sophisticated network of interlocking detection and defence mechanisms, but it is embroiled in a relentless evolutionary arms race against antagonists who aren’t dunces either. Likewise, people are not merely hapless victims of mind viruses. Most reason their way into such beliefs, and if we are to believe the recent study on the use of AI dialogues to dissuade conspiracy theorists, they can be reasoned out of them again.
Critics like
Dan Williams are right that we should take people’s beliefs seriously, not pathologise them. If anything, however, we think we take believers more seriously than he does. Williams often emphasises the role of social incentives and status games in belief communities, whereas we tend to take believers at their word. If someone says he believes in witches, and acts as if he believes in witches, and is even prepared to burn his neighbour at the stake for being a witch, then that person probably believes in witches. As the historian Julian Goodare puts it, “a witchcraft accusation was not ‘really’ about something else; it was really about witchcraft.”

The Limits of Analogies
No analogy is perfect, and reasonable people can disagree about how useful it is to apply epidemiological metaphors to the realm of ideas. Sander Van der Linden has become famous for his hypothesis that we can “inoculate” people against harmful disinformation by exposing them to a weakened dose of the pathogen.
Andy Norman, in his fascinating book Mental Immunity, defends the view that the human mind is equipped with a full-fledged “immune system” and that there are intellectual equivalents to antibodies, immune responses, and auto-immune disease. Norman founded The Cognitive Immunology Research Center, which is dedicated to “developing humanity’s immunity” to infodemics, mind viruses, and divisive ideologies.

The scientific debate about these ideas is still raging, and only time will tell if they lead to fruitful insights. But in any case, we believe that, in many ways, ideas do indeed spread like viruses, and that some of them may have evolved interests of their own. Daniel Dennett’s tongue was only partly planted in his cheek when he quipped that “a scholar is a library’s way of making another library.” People from all ideological camps should be open-minded about the possibility of harmful viruses of the mind—including those that infect people on their own side. If cultural ideas can evolve into harmful parasites, unbeknownst to us and contrary to our intentions, it’s high time we became aware of this.
[Published at Quillette with Steije Hofhuis, Nov 11th, 2024]