Social media giants Facebook, Twitter, Instagram, YouTube and TikTok "fail" to act on most anti-Muslim posts, according to new research released Thursday by the Center for Countering Digital Hate (CCDH).
The five social media companies collectively failed to respond to about 89% of anti-Muslim posts reported between February 15 and March 9, according to the international nonprofit, which has offices in Washington, D.C. and in London.
The CCDH's "Failure to Protect" report comes amid ongoing debates over whether and how social media platforms should be regulated in the future. The platforms have been largely self-regulating until now, but U.S. lawmakers have increasingly called for changes, and the European Union recently moved forward with legislation that would require social media platforms to fight the spread of disinformation.
The report begins with a reference to the support Meta, Twitter and Google have expressed for the Christchurch Call following the 2019 mass shootings at mosques in Christchurch, New Zealand. The shootings, which were broadcast live, prompted the launch of the Christchurch Call, which aims to stamp out online content depicting terrorism and violent extremism.
"Once again, their press releases turn out to be nothing more than empty promises," the CCDH said of the companies' earlier support for the Christchurch Call.
For the study released Thursday, the CCDH said it identified 530 posts in total across the five platforms that "contain disturbing, bigoted and dehumanizing content that targets Muslims through racist caricatures, conspiracies and misrepresentations." The posts had been "viewed at least 25 million times," the CCDH said.
The organization said it reported all posts to the social media companies using each company's "specific reporting tools." The CCDH said "many" posts were "easily identifiable" as "abusive content," but "there was still inaction."
Once the posts were identified, the CCDH said its auditors returned and “verified each post and recorded any action taken” by the companies.
Of the 125 posts the CCDH reported to Facebook, seven, or about 5.6%, were deleted. The CCDH said three of the 105 posts it reported to Twitter had been deleted, or about 2.9%.
Instagram and TikTok took more action against the reported content and profiles, according to the CCDH. Of the 227 posts reported to Instagram, 12 posts were deleted and 20 accounts were deactivated, resulting in a response rate of 14.1%. Twelve of the 50 posts reported to TikTok were deleted and six accounts were deactivated, leaving TikTok with a 36% response rate.
Newsweek contacted Twitter, Facebook, Instagram and TikTok for comment.
Meanwhile, the CCDH said YouTube took no action on the 23 posts the CCDH reported to the company. When contacted for comment on Friday, YouTube disputed the report's findings, saying in a statement that it had taken action on some of the reported content.
"YouTube's hate speech and harassment policies set out clear guidelines prohibiting content that incites violence or hatred against individuals or groups where religion, ethnicity, or other protected attributes are targeted," YouTube told Newsweek. "Of the videos reported to us by the CCDH, five were removed for violating our hate speech policies and eight were subject to age restrictions."
YouTube also referred Newsweek to its Community Guidelines, which "make it clear that we do not allow hate speech or harassment on YouTube," and its Hate Speech Policy, which "specifically prohibits content that promotes violence or hatred against individuals or groups based on attributes such as their immigration status, nationality or religion."
YouTube did not say which of the 23 flagged posts it removed or restricted, or when the platform responded to the flagged posts. Newsweek has contacted YouTube for further comment.
The CCDH told Newsweek it was unaware of any action taken on the posts it had flagged to YouTube.
CCDH CEO Imran Ahmed told Newsweek that YouTube's actions likely took place after the CCDH's last audit for its report.
“They didn’t react quickly,” Ahmed said. “The salient point here is that it’s good that they have acknowledged that these videos violate their standards and have taken action. And that begs the question of why they didn’t do it sooner.”
According to a CCDH article on its website about this week’s study, the collective findings “echo the CCDH’s previous reports on failure to act.”
This year's report concludes with several "calls to action," which include social media companies hiring and training content moderators, specific recommended actions to remove anti-Muslim pages and hashtags, and a series of suggested legislative strategies.