
Hate speech in your social media feed? Try these tips

Social media sites are tools with seemingly infinite potential. In the right hands, they can be a space of community connections and great comfort, but they also can be used to promote messages of hate.
October 27, 2016
1. Unfollow, then block.

Twitter’s abuse guidelines are careful to distinguish “unwanted communication,” such as “something on the internet we disagree with,” from “online abuse.” They advise that if you find something you don’t like, your first step should be to unfollow the user; the next step is to block them.

2. Challenge the haters.

Historian Jason Weixelbaum, a doctoral student at American University who has received anti-Semitic Twitter messages, said offenders rarely openly communicate their intentions. 

“Ask them a direct question: ‘What do you really want to do?’ They tend to run away because they know there are limits to free speech. If they start making violent threats, [social media] companies can stop them.” He also said that engaging directly with them allows social media companies to better see the threat and act on it.

3. Energize the good guys.

When Rabbi Susan Goldberg of Wilshire Boulevard Temple — who has a few hundred followers on Twitter — was attacked on the social media platform (see main story), she reported some of the abusive accounts to Twitter. But on the advice of some friends, she did not block them. “Blocking them would give them energy,” she said. 

Instead, the rabbi took her experience to Facebook, where her network is larger, encouraging her followers to “lend your voice at this important moment and use the hashtag #weveseenthisbefore to stand against hate towards all.” Many wrote messages of support, tagging her Twitter account and populating it with supportive tweets, thereby pushing the hateful comments farther down the feed.

4. Call the cops.

Policies for Twitter and Facebook include language urging users to report violent threats to local law enforcement, advising them to take screenshots of abusive messages or print them out and share them with police to help explain the threat.

After receiving general threats online, journalist Bethany Mandel said the local police department “is aware and drives by my house,” and that she has contact information for the local Homeland Security office, the county prosecutor and the hate crimes division. Still, she said, “there’s nothing anyone can do because there’s no specific threat.” 

5. Keep expectations realistic.

When it comes to Facebook, if you see a post that is not threatening or violent but that you think violates the site’s policies, the company wants you to report it for review. Just don’t always expect the company to respond the way you’d like it to.

“Every day, people come to Facebook to connect with the people and issues they care about,” said Facebook spokesperson Andrea Saul in a phone interview with the Journal. “Given the diversity of the Facebook community, this means that sometimes people will share information that is controversial or offends others.”

Saul said that “there is no place for content encouraging violence, direct threats, terrorism or hate speech on Facebook,” but, given the massive amount of content involved, it’s impossible for the platform to catch everything. 

Rabbi Yonah Bookstein, the founder of Pico Shul and a longtime blogger, said there are more nuanced challenges as well. 

“[Facebook doesn’t] appreciate that ‘Zionism’ has become the new racist code word for ‘Jew.’ ‘Kill the Zionists’ might not mean to Facebook the same as ‘Kill the Jew,’” Bookstein said. “I’m trying to get them to understand that anti-Zionism is the new anti-Semitism; it’s virulent and leading to people getting murdered. But they don’t really see the bigger, broader thing.”

Rabbi Abraham Cooper, associate dean at the Simon Wiesenthal Center and its Museum of Tolerance, who has been involved in identifying online hate for decades, acknowledged that policing social media is a complicated picture. But, he said, Facebook had a “sense of social responsibility from the beginning.” 

Cooper said that “getting a black eye in the public domain” — having attention called to a platform’s shortcomings when it comes to policing hate and terrorism — could have a major impact on how social media companies do business. 

“If a site has the reporting feature, then use it as much as possible. The only way things are going to change is if the companies get the message that enough people find them offensive,” he said. 

And yet, before you mobilize your network of friends to campaign against a Facebook post, be aware that according to Facebook’s Community Standards, “The number of reports does not impact whether something will be removed. We never remove content simply because it has been reported a number of times.” 

“Now if I see something I don’t like, I don’t get as up-in-arms about it. No need to campaign: I report it and move on,” Bookstein said. 

“People feel like they’re fighting the fight for Israel from their computer but they’re not. It’s a waste of our time to campaign [to remove objectionable content]; we should spend it on other things — to spread God’s light; make the world a better place; raise money for the widows, orphans and the poor; mitzvah projects; be Jewish together; enhance Jewish life and our impact on the world.”
