The recent clashes between far-right extremists and police in British towns and cities have once again brought the issue of online propaganda and its real-world consequences to the forefront of public discourse. As we grapple with the spread of racism, violence, and misogyny fuelled by misinformation, it has become increasingly clear that our current approach to combating extremist content is inadequate. Rather than focusing solely on fact-checking and removing hateful messages, we must shift our attention to the audiences being targeted and the algorithms that enable the spread of extremist ideologies.
The unrest that began in Southport, where a group of self-proclaimed "protesters" attacked a mosque in the wake of the tragic deaths of three young girls in a knife attack, highlights the dangerous consequences of unchecked online propaganda. Although the attacker had no connection to the Muslim community, misleading messages quickly spread online, stoking fear and hatred. Even prominent figures like Reform UK MP Nigel Farage contributed to the confusion by questioning the accuracy of information surrounding the attacker's identity, further fanning the flames of discord.
The problem with our current approach to combating online extremism lies in its myopic focus on individual messages. Propagandists are adept at exploiting the attention economy, using provocative and emotionally charged content to generate clicks, income, and power. Factual accuracy is secondary to their primary goal of cultivating an audience receptive to their ideology. By simply removing offending messages, we engage in a futile game of whack-a-mole, as new content quickly emerges to fill the void. Moreover, this approach risks playing into the hands of extremists, who can claim victimhood and paint themselves as censored by the establishment.
To effectively combat online extremism, we must shift our focus from the message to the audience. Political scientist Benedict Anderson's concept of "imagined communities" provides a useful framework for understanding how propagandists create and maintain their followings. By crafting a shared narrative complete with foundational myths, symbols, and a distinct worldview, extremists foster a sense of belonging and purpose among their adherents. These imagined communities often position themselves in opposition to the mainstream, casting doubt on established facts and encouraging members to "do their own research."
The work of the propaganda theorist Jacques Ellul further illuminates the centrality of myths in successful propaganda. Ellul argues that agitation propaganda, which is designed to mobilise people to action, relies heavily on the cultivation of hatred and the identification of a scapegoat. This is evident in the rhetoric of figures like Andrew Tate, who has built a following by stoking misogyny and positioning himself as a leader in a supposed battle against feminism.
To combat the spread of extremist ideologies, we must identify the informational silos in which these imagined communities reside and work to introduce alternative viewpoints and narratives. Rather than engaging in a futile battle against individual messages, we should focus on targeting the algorithms that create and reinforce these silos. By introducing a diversity of perspectives and challenging the foundational myths that underpin extremist ideologies, we can begin to erode the power of online propaganda.
Fact-checking, while important, is not sufficient on its own. As Marianna Spring, the BBC's disinformation and social media correspondent, has documented in her book "Among the Trolls," algorithmic rabbit holes can draw individuals down a path of radicalisation, isolating them from alternative viewpoints. By actively engaging with these communities and providing a way out of the echo chamber, we can begin to mitigate the harmful effects of online extremism.
Combating online extremism requires a fundamental shift in our approach. We must recognize that the battle is not solely about individual messages but about the audiences being targeted and the algorithms that enable the spread of extremist ideologies. By identifying informational silos, introducing alternative narratives, and challenging the foundational myths that underpin extremist worldviews, we can begin to lead the lost back from the brink. It is a daunting task, but one that is essential for the health of our democracy and the well-being of our society.
William Gomes is a British-Bangladeshi anti-racism campaigner, an advocate for the rights of displaced people, and a contributor to various publications. He can be reached at [email protected].