Stuck in echo chambers: how hate feeds on itself online

Twitter, Google, YouTube and Facebook are designed to feed you the content you want to see, no matter how disturbing

It’s not often that parents of mass murderers speak publicly, much less ask for understanding. It took Sue Klebold, mother of Columbine killer Dylan Klebold, more than 16 years to publish A Mother’s Reckoning: Living In The Aftermath Of Tragedy. The moving memoir, which draws on the diaries she has kept since the 1999 Columbine massacre, challenges us to rethink our views of the perpetrators of heinous crimes.

I asked Klebold for an interview not long ago. She politely declined. 

“I rarely accept requests to speak privately with individuals,” she told me. She says that decision “is based entirely on my need to maintain self-care.” And understandably so. Opening up about the horrific acts of a loved one is painful, especially when there is no denying their devastating criminal impact.

Klebold has courageously reached this point, becoming an advocate for mental health and suicide prevention. I am not sure, however, that the same can be said of Raymond Bissonnette, the father of Alexandre Bissonnette, the convicted gunman in the 2017 Quebec City mosque massacre. 

The elder Bissonnette recently publicly asked Prime Minister Justin Trudeau to stop referring to his son’s actions as an act of domestic terrorism. (He has also called the 25-year sentence handed his son “very harsh” and “political.”) He does not believe the terrorism description is fair. According to Bissonnette, there was no “particular ideology” behind his son’s rampage.

He acknowledges his son’s penchant for “order” – among other “obsessive tendencies” noted at his sentencing hearing – but sees nothing extreme enough to suggest a willingness to engage in terrorism. 

The elder Bissonnette erroneously believes the mosque attack was an aberration, a departure from who his son “truly” is. 

It’s true that Bissonnette was technically convicted of murder, not terrorism, but his attack was an act of terrorism. It was motivated by a desire to “save” Canadians from Islam and underscored by his baseless conviction that at least one terrorist was inside the Islamic Cultural Centre of Quebec City when he unleashed his attack, killing six worshippers and injuring six others. 

Misled by this idea, Bissonnette undertook a hateful mission associated with one of the tenets of white supremacy: wherever they are, Muslims cannot be trusted. They were reduced by Bissonnette to enemies, nothing more than stereotypes based on prejudice. Extremism is always blind. 

Though Bissonnette did not himself espouse “white supremacist or neo-Nazi ideology,” according to police, I do wonder how much the design of mainstream platforms like Twitter, Google, YouTube and Facebook – which he obsessively visited to check the sites of far-right conspiracy theorists, among others – played a role in his identifying with such ideas at all. 

YouTube, for one, serves up more of the same kind of content, however disturbing, the more you click on it. 

In fact, as scholar Zeynep Tufekci and others have noted, YouTube recommends increasingly inflammatory content the more you engage with it – almost a test of how hardcore or squeamish you are. The goal is to keep you on the platform longer, so YouTube can make money from the greater number of advertisements you watch in the process. 
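The escalating feedback loop described above can be caricatured in a few lines of code. This is a deliberately crude toy sketch under invented assumptions – the item names, the “intensity” scores and the fixed upward nudge are all mine – and not a description of YouTube’s actual recommendation algorithm:

```python
# A toy engagement-optimized recommender (author's illustration).
# Each item in the catalogue carries an invented "intensity" score
# from 0.0 (benign) to 1.0 (extreme).
catalogue = {
    "cat_video": 0.0,
    "news_clip": 0.2,
    "hot_take": 0.5,
    "conspiracy": 0.8,
    "extremist_rant": 1.0,
}

def recommend(last_click, n=3):
    """Return the n items closest to the user's apparent taste, nudged upward.

    A real system optimizes predicted watch time; here we proxy that by
    targeting content slightly *more* intense than the user's last click.
    """
    base = 0.0 if last_click is None else catalogue[last_click]
    target = min(1.0, base + 0.2)  # the engagement-maximizing nudge
    return sorted(catalogue, key=lambda item: abs(catalogue[item] - target))[:n]

# The feedback loop: a user who always clicks the top recommendation
# drifts, step by step, from mild content to the most extreme item.
click, path = None, []
for _ in range(4):
    click = recommend(click)[0]
    path.append(click)

print(path)  # each step is more intense than the last
```

The point of the sketch is that no one programmed “radicalize the user”; a small, uniform bias toward marginally more stimulating content, applied repeatedly, is enough to walk a viewer from benign clips to the most extreme material in the catalogue.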

Harvard professor Shoshana Zuboff calls this business model “surveillance capitalism”: platforms harvest your behavioural data, sell predictions of what will hold your attention, and feed you whatever content those predictions say you want to see.

Perhaps we don’t have to worry about this when it comes to those who enjoy watching cat videos. But this kind of exposure is harrowing to contemplate in the case of people like Bissonnette, who regularly engage with hateful content and are pulled ever further in that direction.

In effect they shield themselves against content that would otherwise challenge their opinions. They are stuck in echo chambers, where their myopic views are upheld and parroted back to them by others. 

In its mild form this is anti-intellectual, rejecting discussion that considers diverse viewpoints. In the extreme, it spills over into real life, and killers like Bissonnette emerge. They are radicalized in a manner that might be described as invisible – their minds absorbed in the cyberworld. By the time their parents see how radicalized their child has become, it’s too late. A killer has taken innocent lives and traumatized a community.

Hateful content on mainstream platforms is unlikely to disappear any time soon. There’s a lot of it, and it’s growing the more it’s shared.

Content that, over time, polarizes people while discouraging constructive dialogue and debate has no value. 

Platforms that operate in ways that, inadvertently or not, promote bigotry can, and should, be sanctioned – by limiting or blocking hateful content and accounts from their respective digital spheres. It’s not a silver bullet against extremism, but it can prevent users from being overwhelmed by hateful content suggested to them by impersonal algorithms. 

This is especially important with respect to young people who spend hours online. In order for them to genuinely learn, they must have the opportunity to engage with and assess competing points of view. And it’s my opinion that they become better people for it, because in considering difference, we are forced to reckon with the other’s humanity. Other people stop being mere caricatures – the reduction that is happening pervasively online and at many far-right demonstrations (and, yes, even counter-demonstrations) spilling onto the street. Ignorance reigns.

One of Bissonnette’s former teachers, shocked by news of the attack, has said the Alexandre she knew was “not a monster. I liked walking on the same planet as him.”

It was hatred that consumed Bissonnette. If we fail to recognize the humanity even in those we regard as our enemies, we place ourselves at risk of becoming like him. 

