Last year Robert Bowers shot up a synagogue in Pittsburgh, killing eleven people. Before committing this atrocity he wrote on Gab: “HIAS [Hebrew Immigrant Aid Society] likes to bring invaders in that kill our people. I can’t sit by and watch my people get slaughtered. Screw your optics, I’m going in.”
Gab is a Twitter alternative used by many neo-Nazis and alt-righters who have been (or know they would be) banned from actual Twitter. The unintended—but entirely predictable—consequence of throwing extremists off Twitter has been to create a large community of exiles on Gab. In Gabland, it is people who question Jewish conspiracy theories or the idea that the US should be a white ethnostate who are considered “trolls.” A similar community is developing on the YouTube alternative BitChute, whose Alexa ranking is rising quickly.
Bowers’s threat of imminent violence (“Screw your optics, I’m going in”) didn’t alarm any of his fellow extremists on Gab. What if he had written the same thing on Twitter? Someone would have been much more likely to contact the police. Perhaps at that point there wouldn’t have been enough time to stop him anyway. But if he had been on Twitter, it’s possible that someone would have reported him to the police long before the shooting for some ominous statements he had made in the past. In any case, relegating Bowers to a non-mainstream platform didn’t stop him from committing the deadliest attack on Jews in US history.
In the last few weeks, the leading social media companies have doubled down on their strategy of deplatforming people and censoring content. Alt-right accounts are disappearing from Twitter, videos on controversial topics are being deleted from YouTube, and even some politically moderate YouTube streamers/content creators who didn’t violate the terms of service are being demonetized in an effort to drive them away. But deplatforming won’t work.
This claim needs clarification. Whether something “works” or not depends on what you’re trying to accomplish. If Twitter/YouTube/Facebook want to virtue signal by showing that they oppose controversial views (which could well be their true aim), then deplatforming controversial people will work. What I mean is that it won’t accomplish the noble goals that these companies say are motivating them: to prevent violence and the spread of socially destructive misinformation. If these are their goals then deplatforming will backfire—and already has backfired.
Advocates of deplatforming tend to think only one step ahead: Throw people with opinions you don’t like off mainstream social media and you won’t see them again—out of sight, out of mind. But the deplatformers should try thinking two, maybe even three, steps ahead: What will people do after they’re banned? How will their followers react? How will this be perceived by more or less neutral observers? With some forethought, it’s easy to see that banning people with supposedly “bad” or “wrong” views may not be the victory that deplatformers think it is.
Banning people from social media doesn’t make them change their minds. In fact, it makes them less likely to change their minds. It makes them more alienated from mainstream society, and, as noted, it drives them to create alternative communities where the views that got them banned are only reinforced.
Banning people for expressing controversial ideas also denies them the opportunity to be challenged. People with extremist or non-mainstream opinions are often written off as deranged monsters who could not possibly respond to rational argument. There are, of course, some neo-Nazis, Holocaust deniers, and the like, who conform to this cartoonish stereotype. With these people, reason and evidence go in one ear and come out the other. But not everyone outside the mainstream, and not everyone who falls for a misguided conspiracy theory, deserves to be written off. People do sometimes change their minds in response to reason. If they didn’t there would be no point in debating anything.
Kevin MacDonald, a former California State University, Long Beach, psychology professor and current alt-right thought leader, argues that Judaism is a “group evolutionary strategy” that leads Jews to undermine gentile society for their own benefit. Last year I published the first academic critique of MacDonald. My paper received a fair amount of attention on social media (by the standards of academic papers). Some people completely changed their minds about MacDonald, some moderated their views, many alt-righters called me names, and a few crazies threatened me. I couldn’t have hoped for more than this. Most people aren’t going to immediately give up their deeply held beliefs in response to a paper. But I think I at least made a large number of MacDonald’s followers aware that there is an argument on the other side. And I was happy for people to challenge my own conclusions on this subject.
Many of my critics have recently been banned from Twitter. Threads about my paper are filled with long strings of “This tweet is unavailable.” All of these banned people—even if they were arguing in bad faith—were still engaging with the other side. Banning them means limiting their ability to even become aware of alternative views and arguments. Now they’re on Gab and 4chan egging each other on to become more and more extreme.
One rationale for deplatforming controversial people is to prevent them from negatively influencing others—to stop the “corruption of youth.” It’s true that when you ban someone, or take down a YouTube video, you make it more difficult to access controversial content, which will stop some people from seeing and potentially being influenced by it. But this strategy can easily backfire for at least three reasons.
Firstly, banning people or censoring content can draw attention to the very person or ideas you’re trying to suppress. After Alex Jones was banned from YouTube, Twitter, and Facebook, there was a huge jump in traffic on the InfoWars website. Now he’s well known for being banned. It’s quite likely that some people have come under his influence because of the ban.
Secondly, even when banning someone reduces his audience, it can, at the same time, strengthen the audience that remains. Despite the initial bump in traffic, the social media ban means that fewer people are now regularly watching Alex Jones’s videos. But that doesn’t necessarily mean he’s less influential. His website still gets several hundred thousand visits every day. The people who stuck with him—or started listening to him because of the attention brought to him by the ban—probably feel more aggrieved than ever, and they are being pushed further away from mainstream platforms and into alternative communities where they will get less exposure to alternative views.
Thirdly, any kind of censorship can create an aura of conspiracy that makes forbidden ideas attractive. This was the main consequence of the foolish laws against Holocaust denial in several European countries. No one in history ever gave up Holocaust denial because of these laws. On the other hand, quite a few people have concluded that, if it’s necessary to ban dissent from the mainstream narrative, something fishy must be going on. The correct answer to Holocaust denial, or any other wrong idea, is to explain why it’s wrong. Of course some people will fail to be convinced by the evidence. We have to make peace with the fact that, to use T. H. Huxley’s expression, “so long as men are men and society is society,” some people will hold and promote crackpot ideas.
An even more fundamental reason why social media companies shouldn’t try to suppress controversial ideas is that they are very bad at determining who and what is wrong or dangerous. There will inevitably be many false positives and false negatives. A major consequence of deplatforming and censorship will simply be to introduce chaos into our political discussions. Nothing good can come of that.
Nathan Cofnas is a philosophy DPhil candidate at the University of Oxford. You can follow him on Twitter @nathancofnas.