As Musk is learning, content moderation is a messy job

FILE - Elon Musk, Tesla CEO, attends the opening of the Tesla factory Berlin Brandenburg in Gruenheide, Germany, March 22, 2022. Musk said during a presentation Wednesday, Dec. 1, 2022, that his Neuralink company is seeking permission to test its brain implant in people soon. Musk's Neuralink is one of many groups working on linking brains to computers, efforts aimed at helping treat brain disorders, overcoming brain injuries and other applications. (Patrick Pleul/Pool via AP, File)

Now that he's back on Twitter, neo-Nazi Andrew Anglin wants somebody to explain the rules.

Anglin, the founder of a neo-Nazi website, was reinstated Thursday, one of many previously banned users to benefit from an amnesty announced by Musk. The next day, Musk suspended Ye, the rapper formerly known as Kanye West, after he posted a swastika with a Star of David in it.

"That's cool," Anglin tweeted Friday. "I mean, whatever the rules are, people will follow them. We just need to know what the rules are."

Ask Musk. Since the world's richest man paid $44 billion for Twitter, the platform has struggled to define its rules for misinformation and hate speech, issued contradictory announcements, and failed to fully address what researchers say is a troubling rise in hate speech.

As Musk may be learning, running a global platform with nearly 240 million active daily users often demands imperfect solutions to messy situations: tough choices that must ultimately be made by a human and are sure to displease someone.

A self-described "free speech absolutist," Musk has said he wants to make Twitter a global digital town square. But he also said he wouldn't make major decisions about content or about restoring banned accounts before setting up a "content moderation council" with diverse viewpoints.

He soon changed his mind after polling users on Twitter, and offered reinstatement to a long list of formerly banned users including former President Donald Trump, Ye, the satire site The Babylon Bee, the comedian Kathy Griffin and Anglin, the neo-Nazi.

And while Musk's own statements suggested he would allow all legal content on the platform, Ye's banishment shows that's not entirely the case. The swastika image posted by the rapper falls in the "lawful but awful" category that often bedevils content moderators, according to Eric Goldman, a technology law expert and professor at Santa Clara University law school.

While Europe has new rules requiring social media platforms to create policies on misinformation and hate speech, Goldman noted that in the U.S. at least, loose regulations allow Musk to run Twitter as he sees fit, despite his inconsistent approach.

"What Musk is doing with Twitter is completely permissible under U.S. law," Goldman said.

Pressure from the EU may force Musk to lay out his policies to ensure he is complying with the new law, which takes effect next year. Last month, a senior EU official warned that Twitter would have to improve its efforts to combat hate speech and misinformation; failure to comply could lead to huge fines.

In another confusing move, Twitter announced in late November that it would no longer enforce its policy against COVID-19 misinformation. Days later, it posted an update claiming that "None of our policies have changed."

On Friday, Musk revealed what he said was the inside story of Twitter's decision in 2020 to limit the spread of a New York Post story about Hunter Biden's laptop.

Twitter initially blocked links to the story on its platform, citing concerns that it contained material obtained through computer hacking. That decision was reversed after it was called a mistake by then-Twitter CEO Jack Dorsey. Facebook also took actions to limit the story's spread.

The information revealed by Musk included Twitter's decision to delete a handful of tweets after receiving a request from Joe Biden's campaign. The tweets included nude photos of Hunter Biden that had been shared without his consent, a violation of Twitter's rules against nonconsensual nudity.

Instead of revealing nefarious conduct or collusion with Democrats, Musk's revelation highlighted the kind of difficult content moderation decisions that he will now face.

"Impossible, messy and squishy decisions" are unavoidable, according to Yoel Roth, Twitter's former head of trust and safety, who resigned shortly after Musk took over.

While far from perfect, the old Twitter strove to be transparent with users and steady in enforcing its rules, Roth said. That changed under Musk, he told a Knight Foundation forum this week.

"When push came to shove, when you buy a $44 billion thing, you get to have the final say in how that $44 billion thing is governed," Roth said.

While much of the attention has been on Twitter's moves in the U.S., the cutbacks of content-moderation workers are affecting other parts of the world too, according to activists with the #StopToxicTwitter campaign.

"We're not talking about people not having resilience to hear things that hurt feelings," said Thenmozhi Soundararajan, executive director of Equality Labs, which works to combat caste-based discrimination in South Asia. "We are talking about the prevention of dangerous genocidal hate speech that can lead to mass atrocities."

Soundararajan's organization sits on Twitter's Trust and Safety Council, which hasn't met since Musk took over. She said "millions of Indians are terrified about who is going to get reinstated," and the company has stopped responding to the group's concerns.

"So what happens if there's another call for violence? Like, do I have to tag Elon Musk and hope that he's going to address the pogrom?" Soundararajan said.

Instances of hate speech and racial epithets rose sharply as some users sought to test the new owner's limits. The number of tweets containing hateful terms continues to rise, according to a report published Friday by the Center for Countering Digital Hate, a group that tracks online hate and extremism.

Musk has said Twitter has reduced the spread of tweets containing hate speech, making them harder to find unless a user searches for them. But that failed to satisfy the center's CEO, Imran Ahmed, who called the rise in hate speech a "clear failure to meet his own self-proclaimed standards."

Immediately after Musk's takeover and the departure of much of the company's staff, researchers who previously had flagged harmful hate speech or misinformation to the platform said they got no response.

Responsiveness has improved somewhat. Jesse Littlewood, vice president for campaigns at Common Cause, said his group reached out to Twitter last week about a tweet from U.S. Rep. Marjorie Taylor Greene that alleged election fraud in Arizona. Greene's personal account was reinstated by Musk after she was kicked off Twitter for spreading COVID-19 misinformation.

This time, Twitter was quick to respond, telling Common Cause that the tweet didn't violate any rules and would stay up, even though Twitter requires the labeling or removal of content that spreads false or misleading claims about election results.

Twitter gave Littlewood no explanation for why it wasn't following its own rules.

"I find that pretty confounding," Littlewood said.

Twitter did not respond to messages seeking comment for this story. Musk has defended the platform's performance since he took over, and said mistakes will happen as it evolves. "We will do lots of dumb things," he tweeted.

To Musk's many online fans, the disarray is a feature, not a bug, of the site under its new ownership, and a reflection of the free speech mecca they hope Twitter will be.

"I love Elon Twitter so far," tweeted a user who goes by the name Some Dude. "The chaos is glorious!"

David Klepper and Matt O'Brien, The Associated Press