UN Human Rights Expert Urges States to Curb Intolerance Online

Following the shooting deaths of 11 worshippers at a synagogue in the eastern United States, a U.N. human rights expert urged governments on Monday to do more to curb racist and anti-Semitic intolerance, especially online.

“That event should be a catalyst for urgent action against hate crimes, but also a reminder to fight harder against the current climate of intolerance that has made racist, xenophobic and anti-Semitic attitudes and beliefs more acceptable,” U.N. Special Rapporteur Tendayi Achiume said of Saturday’s attack on a synagogue in Pittsburgh, Pennsylvania.

Achiume, whose mandate is the elimination of racism, racial discrimination, xenophobia and related intolerance, noted in her annual report that “Jews remain especially vulnerable to anti-Semitic attacks online.”

She said that Nazi and neo-Nazi groups exploit the internet to spread and incite hate because it is “largely unregulated, decentralized, cheap” and anonymous.

Achiume, a law professor at the University of California, Los Angeles (UCLA) School of Law, said neo-Nazi groups are increasingly relying on the internet and social media platforms to recruit new members.

Facebook, Twitter and YouTube are among their favorites.

On Facebook, for example, hate groups connect with sympathetic supporters and use the platform to recruit new members, organize events and raise money for their activities. YouTube, which draws more than 1.5 billion viewers each month, is another critical communications tool for propaganda videos and even neo-Nazi music videos. And according to one study cited in the special rapporteur’s report, the presence of white nationalist movements on Twitter has increased by more than 600 percent since 2012.

The special rapporteur noted that while digital technology has become an integral and positive part of most people’s lives, “these developments have also aided the spread of hateful movements.”

She said that in the past year, platforms including Facebook, Twitter and YouTube have banned individual users who contributed to hate movements or threatened violence, but that ensuring the removal of racist content online remains difficult.

Some hate groups avoid raising red flags by using racially coded messaging, making it harder for social media platforms to recognize their hate speech and shut down their presence.

Achiume cited as an example the cartoon character “Pepe the Frog,” which was appropriated by neo-Nazi and white supremacist groups and widely displayed during a white supremacist rally in the southern U.S. city of Charlottesville, Virginia, in 2017.

The special rapporteur welcomed actions in several states to counter intolerance online, but cautioned that such measures must not become a pretext for censorship and other abuses. She also urged governments to work with the private sector, specifically technology companies, to fight such prejudices in the digital space.

