Buffalo Mass Shooting Puts Spotlight on the Good, Bad and Ugly of Social Media

The 18-year-old who allegedly shot and killed 10 people in a racially motivated attack at a Buffalo supermarket last week had studied the 2019 Christchurch, New Zealand, mass shootings – which were also live-streamed – on an easily accessible online forum.

Days before the massacre, he posted a 180-page manifesto online. He then live-streamed the attack on Twitch, a popular gaming platform. While Twitch removed the video in less than two minutes, versions of it shared by others, including followers of the hate-ridden platform 4chan, remain available to this day.

While major social media players like Facebook and Twitter devote considerable time and resources to at least attempting to police their sites, 4chan is among a large number of sites where racist and antisemitic content is shared widely with very little oversight – for its tens of millions of users, almost anything goes.

Javeed Sukhera, MD, PhD, FRCPC, psychiatrist-in-chief of the Hartford HealthCare Behavioral Health Network’s Institute of Living and chair of psychiatry for Hartford Hospital, spends significant time on Twitter and other social media platforms and is concerned about the role they are playing in mass shootings. He said there are two sides to the debate.

“Social media creates a space that can add fuel to the fire,” Sukhera said. “But it can also help put the fire out and make people feel more connected, such as during the pandemic. And even with the Buffalo shooting there were a lot of positive messages, encouraging people not to be scared, to live their lives, to be themselves. But, unfortunately, there is a much darker side. These are crimes that are designed to get attention and amplify racist or antisemitic views, and social media is doing just that.”

Natchaug Hospital Associate Medical Director Paul Weigle, MD, serves as chair of the Media Committee for the American Academy of Child & Adolescent Psychiatry, and on the scientific advisory board for Children & Screens: Institute for Digital Media and Child Development. He said the internet and social media have enabled hate groups to grow and can be a breeding ground for copycat killers.

“Internet forums have provided a collective identity, easy access and legal freedoms to hate groups,” Weigle said. “Members of hate groups have been known to advertise and recruit via the vastly popular, but relatively unmonitored online gaming chats in shooter games, or on Discord [a popular online gaming chat platform]. They tend to start small, normalize racial slurs in a private team chat. To those who respond positively, they may send links to sites on Twitter, propaganda videos or hate group websites that appear innocuous, use ambiguous group names, but disseminate extremist material leading to potential indoctrination.”

Weigle said it’s not hard to find extremist content, even if you aren’t looking for it. Getting news via social media, which many youths do, is a risk factor for receiving polarizing news stories and extremist material.

“Search engines make it very easy to find hate material online, even if it was not what the searcher was looking for,” Weigle said. “For example, YouTube algorithms tend to recommend increasingly extreme content. YouTube has responded by filtering, removing tens of thousands of videos per month based on hate speech. But it’s like playing whack-a-mole.”

While it’s true social media platforms have learned to remove violent, extremist content faster over the past few years, it’s debatable whether they have done enough.

In the wake of the shooting in New Zealand, major social media platforms joined governments around the world in the Christchurch Call to Action initiative, agreeing to work together to combat terrorism and violent extremist content. They are using technology that can flag inappropriate content and, once it is identified, have it taken down quickly.

What else can be done? Sukhera and Weigle offer these solutions:

  • Shooters are often motivated in part by fame, and they learn from one another, so it’s important that news outlets and social media sites avoid publication of a shooter’s name, photo, or works (such as a manifesto). Videos featuring violence of any kind must be taken down as soon as possible to limit their spread and influence. It’s encouraging that major social media sites responded more rapidly than in the past, although an even more rapid response is needed.
  • Social media sites should continue to adjust algorithms so they stop promoting controversial and inflammatory content; monitor for posts that celebrate or encourage violence or the perpetrators of mass violence, filter such posts, and consider suspending the users who post them; and keep removing videos depicting violent acts as quickly as possible.
  • Lawmakers can help create standards and put them into law, so that even the Wild West of social media companies must comply with protocols and policies.

The suspect in the Buffalo shooting was 18 years old. What can we do to stop young copycats? Dr. Weigle offered these suggestions:

  • Discuss racial and religious bias with children and teens, and expose kids to peers of different groups, whether in person or in shows, movies and books.
  • Be aware of the online contacts your kids have in games: kids and younger teens should play in common areas of the home without headphones so online interactions can be monitored.
  • Talk with kids about online activities and contacts, and approach them with a curious rather than judgmental attitude to keep the dialogue going.
  • Report any inappropriate content or behavior to the game developer or website. Links can be found on https://www.end-violence.org/safeonlinecovid.
  • If you feel unable to help a teen involved in a hate group, or are concerned about your teen’s safety, don’t hesitate to seek help from a qualified mental health professional.
